[ 660.566404] env[67927]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 661.207488] env[67977]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 662.558327] env[67977]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' {{(pid=67977) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 662.558687] env[67977]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoopPlugin'>' with name 'noop' {{(pid=67977) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 662.558810] env[67977]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' {{(pid=67977) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 662.559091] env[67977]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 662.754633] env[67977]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=67977) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 662.765440] env[67977]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=67977) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 662.870212] env[67977]: INFO nova.virt.driver [None req-8db566d9-bca9-4223-8c0b-0ba41d700276 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 662.942080] env[67977]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 662.942246] env[67977]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 662.942346] env[67977]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=67977) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 665.777364] env[67977]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-8bcc3c25-9da4-4c2d-8bed-c91a66d81ade {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.793885] env[67977]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=67977) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 665.794085] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-6d261645-02f4-457a-be1d-763acb7bf1d6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.832506] env[67977]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 136a1.
[ 665.832667] env[67977]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.890s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 665.833166] env[67977]: INFO nova.virt.vmwareapi.driver [None req-8db566d9-bca9-4223-8c0b-0ba41d700276 None None] VMware vCenter version: 7.0.3
[ 665.836588] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-883c2bb6-2f77-44a4-8c2d-381e37a04aec {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.853848] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-921c5914-fb65-4fe5-a191-dcaab222cf83 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.859761] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-789ada48-7c43-4dac-8d28-a593721ef3b1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.866117] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68862044-0523-4349-a2ce-106fde8ece37 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.879204] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44fca4b3-1546-45bc-adb0-f9ee90da7d53 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.885063] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9ccb402-f0eb-4724-86ca-e1ab903e3c7e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.915278] env[67977]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-2bc02361-b0e3-479b-8bf9-59bdbb6ec4d4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.920157] env[67977]: DEBUG nova.virt.vmwareapi.driver [None req-8db566d9-bca9-4223-8c0b-0ba41d700276 None None] Extension org.openstack.compute already exists. {{(pid=67977) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 665.922723] env[67977]: INFO nova.compute.provider_config [None req-8db566d9-bca9-4223-8c0b-0ba41d700276 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 665.941356] env[67977]: DEBUG nova.context [None req-8db566d9-bca9-4223-8c0b-0ba41d700276 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),9399fcf0-19b6-49ce-9c29-f4127b904367(cell1) {{(pid=67977) load_cells /opt/stack/nova/nova/context.py:464}} [ 665.943332] env[67977]: DEBUG oslo_concurrency.lockutils [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 665.943566] env[67977]: DEBUG oslo_concurrency.lockutils [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 665.944284] env[67977]: DEBUG oslo_concurrency.lockutils [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 665.944706] env[67977]: DEBUG oslo_concurrency.lockutils [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Acquiring lock "9399fcf0-19b6-49ce-9c29-f4127b904367" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 665.944901] env[67977]: DEBUG oslo_concurrency.lockutils [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Lock "9399fcf0-19b6-49ce-9c29-f4127b904367" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 665.945873] env[67977]: DEBUG oslo_concurrency.lockutils [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Lock "9399fcf0-19b6-49ce-9c29-f4127b904367" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 665.970386] env[67977]: INFO dbcounter [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Registered counter for database nova_cell0 [ 665.978749] env[67977]: INFO dbcounter [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Registered counter for database nova_cell1 [ 665.981837] env[67977]: DEBUG oslo_db.sqlalchemy.engines [None req-58574229-6676-4524-98f8-9066d8f1487f None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67977) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}} [ 665.983155] env[67977]: DEBUG oslo_db.sqlalchemy.engines [None req-58574229-6676-4524-98f8-9066d8f1487f None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67977) _check_effective_sql_mode 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}} [ 665.986516] env[67977]: DEBUG dbcounter [-] [67977] Writer thread running {{(pid=67977) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}} [ 665.987800] env[67977]: DEBUG dbcounter [-] [67977] Writer thread running {{(pid=67977) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}} [ 665.990027] env[67977]: ERROR nova.db.main.api [None req-58574229-6676-4524-98f8-9066d8f1487f None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 665.990027] env[67977]: result = function(*args, **kwargs) [ 665.990027] env[67977]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 665.990027] env[67977]: return func(*args, **kwargs) [ 665.990027] env[67977]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 665.990027] env[67977]: result = fn(*args, **kwargs) [ 665.990027] env[67977]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 665.990027] env[67977]: return f(*args, **kwargs) [ 665.990027] env[67977]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version [ 665.990027] env[67977]: return db.service_get_minimum_version(context, binaries) [ 665.990027] env[67977]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 665.990027] env[67977]: _check_db_access() [ 665.990027] env[67977]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 665.990027] env[67977]: stacktrace = ''.join(traceback.format_stack()) [ 665.990027] env[67977]: [ 665.991190] env[67977]: ERROR nova.db.main.api [None req-58574229-6676-4524-98f8-9066d8f1487f None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 665.991190] env[67977]: result = function(*args, **kwargs) [ 665.991190] env[67977]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 665.991190] env[67977]: return func(*args, **kwargs) [ 665.991190] env[67977]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 665.991190] env[67977]: result = fn(*args, **kwargs) [ 665.991190] env[67977]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 665.991190] env[67977]: return f(*args, **kwargs) [ 665.991190] env[67977]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version [ 665.991190] env[67977]: return db.service_get_minimum_version(context, binaries) [ 665.991190] env[67977]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 665.991190] env[67977]: _check_db_access() [ 665.991190] env[67977]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 665.991190] env[67977]: stacktrace = ''.join(traceback.format_stack()) [ 665.991190] env[67977]: [ 665.991668] env[67977]: WARNING nova.objects.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000 [ 665.991699] env[67977]: WARNING nova.objects.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Failed to get minimum service version for cell 9399fcf0-19b6-49ce-9c29-f4127b904367 [ 665.992138] env[67977]: DEBUG oslo_concurrency.lockutils [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Acquiring lock "singleton_lock" {{(pid=67977) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 665.992305] env[67977]: DEBUG oslo_concurrency.lockutils [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Acquired lock "singleton_lock" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 665.992554] env[67977]: DEBUG oslo_concurrency.lockutils [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Releasing lock "singleton_lock" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 665.992886] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Full set of CONF: {{(pid=67977) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}} [ 665.993043] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] ******************************************************************************** {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}} [ 665.993177] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] Configuration options gathered from: {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}} [ 665.993315] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}} [ 665.993511] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}} [ 665.993637] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] ================================================================================ {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}} [ 665.993847] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] allow_resize_to_same_host = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.994030] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] arq_binding_timeout = 300 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.994167] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] backdoor_port = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.994295] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] backdoor_socket = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.994462] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] block_device_allocate_retries = 60 {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.994631] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] block_device_allocate_retries_interval = 3 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.994803] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] cert = self.pem {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.994972] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.995159] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] compute_monitors = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.995327] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] config_dir = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.995501] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] config_drive_format = iso9660 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.995637] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.995801] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] config_source = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.995970] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] console_host = devstack {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.996151] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] control_exchange = nova {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.996312] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] cpu_allocation_ratio = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.996473] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] daemon = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.996675] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] debug = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.996845] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] default_access_ip_network_name = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.997023] 
env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] default_availability_zone = nova {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.997187] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] default_ephemeral_format = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.997347] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] default_green_pool_size = 1000 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.997587] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.997753] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] default_schedule_zone = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.997910] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] disk_allocation_ratio = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.998083] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] enable_new_services = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.998288] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] enabled_apis = ['osapi_compute'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.998426] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] enabled_ssl_apis = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.998588] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] flat_injected = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.998749] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] force_config_drive = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.998907] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] force_raw_images = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.999090] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f 
None None] graceful_shutdown_timeout = 5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.999256] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] heal_instance_info_cache_interval = 60 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.999502] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] host = cpu-1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.999686] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] initial_cpu_allocation_ratio = 4.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 665.999854] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] initial_disk_allocation_ratio = 1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.000031] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] initial_ram_allocation_ratio = 1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.000261] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.000441] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] instance_build_timeout = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.000609] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] instance_delete_interval = 300 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.000779] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] instance_format = [instance: %(uuid)s] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.000947] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] instance_name_template = instance-%08x {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.001123] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] instance_usage_audit = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.001299] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] instance_usage_audit_period = month {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.001468] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.001642] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.001812] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] internal_service_availability_zone = internal {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.001970] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] key = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.002175] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] live_migration_retry_count = 30 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.002352] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] log_config_append = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.002525] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.002690] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] log_dir = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.002852] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] log_file = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.002983] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] log_options = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.003164] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] log_rotate_interval = 1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.003337] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] log_rotate_interval_type = days {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.003505] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] log_rotation_type = none {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.003638] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.003767] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.003935] env[67977]: DEBUG 
oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.004115] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.004246] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.004408] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] long_rpc_timeout = 1800 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.004569] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] max_concurrent_builds = 10 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.004726] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] max_concurrent_live_migrations = 1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.004884] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] max_concurrent_snapshots = 5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.005051] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] max_local_block_devices = 3 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.005212] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] max_logfile_count = 30 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.005370] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] max_logfile_size_mb = 200 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.005530] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] maximum_instance_delete_attempts = 5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.005696] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] metadata_listen = 0.0.0.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.005867] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] metadata_listen_port = 8775 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.006063] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] metadata_workers = 2 {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.006251] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] migrate_max_retries = -1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.006425] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] mkisofs_cmd = genisoimage {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.006637] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] my_block_storage_ip = 10.180.1.21 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.006772] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] my_ip = 10.180.1.21 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.006936] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] network_allocate_retries = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.007130] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.007306] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] osapi_compute_listen = 0.0.0.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.007468] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] osapi_compute_listen_port = 8774 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.007642] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] osapi_compute_unique_server_name_scope = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.007809] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] osapi_compute_workers = 2 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.007971] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] password_length = 12 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.008170] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] periodic_enable = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.008310] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] periodic_fuzzy_delay = 60 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.008484] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] pointer_model = usbtablet {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.008656] env[67977]: 
DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] preallocate_images = none {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.008820] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] publish_errors = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.008955] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] pybasedir = /opt/stack/nova {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.009130] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] ram_allocation_ratio = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.009294] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] rate_limit_burst = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.009490] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] rate_limit_except_level = CRITICAL {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.009668] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] rate_limit_interval = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.009834] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] reboot_timeout = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.009996] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] reclaim_instance_interval = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.010205] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] record = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.010390] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] reimage_timeout_per_gb = 60 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.010564] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] report_interval = 120 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.010732] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] rescue_timeout = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.010896] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] reserved_host_cpus = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.011075] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] reserved_host_disk_mb = 0 {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.011244] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] reserved_host_memory_mb = 512 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.011408] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] reserved_huge_pages = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.011575] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] resize_confirm_window = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.011737] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] resize_fs_using_block_device = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.011899] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] resume_guests_state_on_host_boot = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.012081] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.012250] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] rpc_response_timeout = 60 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.012415] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] run_external_periodic_tasks = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.012590] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] running_deleted_instance_action = reap {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.012756] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] running_deleted_instance_poll_interval = 1800 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.012917] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] running_deleted_instance_timeout = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.013089] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] scheduler_instance_sync_interval = 120 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.013260] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] service_down_time = 720 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.013433] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] servicegroup_driver = db {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.013599] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] shelved_offload_time = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.013761] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] shelved_poll_interval = 3600 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.013931] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] shutdown_timeout = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.014128] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] source_is_ipv6 = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.014309] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] ssl_only = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.014564] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.014737] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] sync_power_state_interval = 600 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.014900] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] sync_power_state_pool_size = 1000 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.015087] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] syslog_log_facility = LOG_USER {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.015253] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] tempdir = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.015417] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] timeout_nbd = 10 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.015612] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] transport_url = **** {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.015799] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] update_resources_interval = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.015968] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] use_cow_images = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.016147] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f 
None None] use_eventlog = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.016315] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] use_journal = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.016480] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] use_json = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.016644] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] use_rootwrap_daemon = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.016805] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] use_stderr = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.016968] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] use_syslog = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.017144] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vcpu_pin_set = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.017318] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plugging_is_fatal = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.017487] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plugging_timeout = 300 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.017658] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] virt_mkfs = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.017820] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] volume_usage_poll_interval = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.017984] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] watch_log_file = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.018205] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] web = /usr/share/spice-html5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 666.018412] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_concurrency.disable_process_locking = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.018713] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.018895] 
env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.019080] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.019261] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_metrics.metrics_process_name = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.019453] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.019638] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.019826] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.auth_strategy = keystone {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.019998] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.compute_link_prefix = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.020194] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.020374] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.dhcp_domain = novalocal {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.020547] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.enable_instance_password = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.020718] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.glance_link_prefix = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.020887] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.021074] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.instance_list_cells_batch_strategy = distributed {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.021245] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] 
api.instance_list_per_project_cells = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.021413] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.list_records_by_skipping_down_cells = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.021587] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.local_metadata_per_cell = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.021759] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.max_limit = 1000 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.021930] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.metadata_cache_expiration = 15 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.022143] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.neutron_default_tenant_id = default {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.022333] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.use_forwarded_for = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.022508] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.use_neutron_default_nets = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.022685] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.022852] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.vendordata_dynamic_failure_fatal = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.023033] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.023218] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.vendordata_dynamic_ssl_certfile = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.023397] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.vendordata_dynamic_targets = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.023568] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] api.vendordata_jsonfile_path = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.023751] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] 
    cache.backend = dogpile.cache.memcached
    cache.backend_argument = ****
    cache.config_prefix = cache.oslo
    cache.dead_timeout = 60.0
    cache.debug_cache_backend = False
    cache.enable_retry_client = False
    cache.enable_socket_keepalive = False
    cache.enabled = True
    cache.expiration_time = 600
    cache.hashclient_retry_attempts = 2
    cache.hashclient_retry_delay = 1.0
    cache.memcache_dead_retry = 300
    cache.memcache_password =
    cache.memcache_pool_connection_get_timeout = 10
    cache.memcache_pool_flush_on_reconnect = False
    cache.memcache_pool_maxsize = 10
    cache.memcache_pool_unused_timeout = 60
    cache.memcache_sasl_enabled = False
    cache.memcache_servers = ['localhost:11211']
    cache.memcache_socket_timeout = 1.0
    cache.memcache_username =
    cache.proxies = []
    cache.retry_attempts = 2
    cache.retry_delay = 0.0
    cache.socket_keepalive_count = 1
    cache.socket_keepalive_idle = 1
    cache.socket_keepalive_interval = 1
    cache.tls_allowed_ciphers = None
    cache.tls_cafile = None
    cache.tls_certfile = None
    cache.tls_enabled = False
    cache.tls_keyfile = None
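The [cache] values map onto oslo.cache's dogpile-based region; in this deployment the backend is dogpile.cache.memcached against localhost:11211. A hedged sketch of the standard oslo.cache wiring (Nova does this internally; the overrides below swap in the in-process memory backend so the sketch runs without a memcached):

    # Standard oslo.cache wiring for a [cache] section like the one above.
    from oslo_cache import core as cache
    from oslo_config import cfg

    CONF = cfg.CONF
    cache.configure(CONF)                       # registers the [cache] options
    CONF([], project='nova')
    CONF.set_override('enabled', True, group='cache')
    # Stand-in for dogpile.cache.memcached so this sketch is self-contained:
    CONF.set_override('backend', 'dogpile.cache.memory', group='cache')

    region = cache.create_region()
    cache.configure_cache_region(CONF, region)  # applies backend, expiration_time, ...

    region.set('example-key', 'example-value')
    assert region.get('example-key') == 'example-value'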
    cinder.auth_section = None
    cinder.auth_type = password
    cinder.cafile = None
    cinder.catalog_info = volumev3::publicURL
    cinder.certfile = None
    cinder.collect_timing = False
    cinder.cross_az_attach = True
    cinder.debug = False
    cinder.endpoint_template = None
    cinder.http_retries = 3
    cinder.insecure = False
    cinder.keyfile = None
    cinder.os_region_name = RegionOne
    cinder.split_loggers = False
    cinder.timeout = None

    compute.consecutive_build_service_disable_threshold = 10
    compute.cpu_dedicated_set = None
    compute.cpu_shared_set = None
    compute.image_type_exclude_list = []
    compute.live_migration_wait_for_vif_plug = True
    compute.max_concurrent_disk_ops = 0
    compute.max_disk_devices_to_attach = -1
    compute.packing_host_numa_cells_allocation_strategy = False
    compute.provider_config_location = /etc/nova/provider_config/
    compute.resource_provider_association_refresh = 300
    compute.shutdown_retry_interval = 10
    compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse']
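cinder.catalog_info tells Nova how to locate the volume endpoint in the Keystone service catalog, in the form <service_type>:<service_name>:<endpoint_type>; "volumev3::publicURL" means service type volumev3, any service name, public endpoint. A hypothetical parser, just to make the format concrete:

    # Hypothetical helper illustrating the cinder.catalog_info format.
    def parse_catalog_info(catalog_info: str) -> dict:
        service_type, service_name, interface = catalog_info.split(':')
        return {
            'service_type': service_type,
            'service_name': service_name or None,       # empty field = any name
            'interface': interface.replace('URL', ''),  # publicURL -> public
        }

    print(parse_catalog_info('volumev3::publicURL'))
    # {'service_type': 'volumev3', 'service_name': None, 'interface': 'public'}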
    conductor.workers = 2

    console.allowed_origins = []
    console.ssl_ciphers = None
    console.ssl_minimum_version = default

    consoleauth.token_ttl = 600

    cyborg.cafile = None
    cyborg.certfile = None
    cyborg.collect_timing = False
    cyborg.connect_retries = None
    cyborg.connect_retry_delay = None
    cyborg.endpoint_override = None
    cyborg.insecure = False
    cyborg.keyfile = None
    cyborg.max_version = None
    cyborg.min_version = None
    cyborg.region_name = None
    cyborg.service_name = None
    cyborg.service_type = accelerator
    cyborg.split_loggers = False
    cyborg.status_code_retries = None
    cyborg.status_code_retry_delay = None
    cyborg.timeout = None
    cyborg.valid_interfaces = ['internal', 'public']
    cyborg.version = None
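Groups like [cyborg] above (and [glance], [ironic], [keystone] further down) are standard keystoneauth1 session/adapter option sets: cafile, certfile, connect_retries, service_type, valid_interfaces, and so on. A sketch of the usual keystoneauth1 loading pattern for such a group; this is the generic ksa idiom, not Nova's exact code, and it assumes the group also carries auth credentials:

    # Generic keystoneauth1 loading for an options group like [cyborg].
    from keystoneauth1 import loading as ks_loading
    from oslo_config import cfg

    CONF = cfg.CONF
    GROUP = 'cyborg'

    ks_loading.register_auth_conf_options(CONF, GROUP)
    ks_loading.register_session_conf_options(CONF, GROUP)
    ks_loading.register_adapter_conf_options(CONF, GROUP)
    CONF([], project='nova')

    auth = ks_loading.load_auth_from_conf_options(CONF, GROUP)
    session = ks_loading.load_session_from_conf_options(CONF, GROUP, auth=auth)
    # service_type = accelerator and valid_interfaces = ['internal', 'public']
    # become the adapter's service type and endpoint interface preference.
    adapter = ks_loading.load_adapter_from_conf_options(CONF, GROUP, session=session)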
    database.backend = sqlalchemy
    database.connection = ****
    database.connection_debug = 0
    database.connection_parameters =
    database.connection_recycle_time = 3600
    database.connection_trace = False
    database.db_inc_retry_interval = True
    database.db_max_retries = 20
    database.db_max_retry_interval = 10
    database.db_retry_interval = 1
    database.max_overflow = 50
    database.max_pool_size = 5
    database.max_retries = 10
    database.mysql_sql_mode = TRADITIONAL
    database.mysql_wsrep_sync_wait = None
    database.pool_timeout = None
    database.retry_interval = 10
    database.slave_connection = ****
    database.sqlite_synchronous = True
    database.use_db_reconnect = False
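database.connection and database.slave_connection print as '****' because oslo.config masks any option registered with secret=True when dumping values; the process still sees the real value. A minimal demonstration (illustrative default URL, not this deployment's real one):

    # Why database.connection is masked in the dump above.
    import logging

    from oslo_config import cfg

    LOG = logging.getLogger(__name__)
    CONF = cfg.CONF

    CONF.register_opts(
        [cfg.StrOpt('connection', secret=True,
                    default='mysql+pymysql://user:pass@127.0.0.1/nova')],
        group='database')

    logging.basicConfig(level=logging.DEBUG)
    CONF([], project='nova')
    CONF.log_opt_values(LOG, logging.DEBUG)  # logs "database.connection = ****"
    print(CONF.database.connection)          # code still reads the real value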
    api_database.backend = sqlalchemy
    api_database.connection = ****
    api_database.connection_debug = 0
    api_database.connection_parameters =
    api_database.connection_recycle_time = 3600
    api_database.connection_trace = False
    api_database.db_inc_retry_interval = True
    api_database.db_max_retries = 20
    api_database.db_max_retry_interval = 10
    api_database.db_retry_interval = 1
    api_database.max_overflow = 50
    api_database.max_pool_size = 5
    api_database.max_retries = 10
    api_database.mysql_sql_mode = TRADITIONAL
    api_database.mysql_wsrep_sync_wait = None
    api_database.pool_timeout = None
    api_database.retry_interval = 10
    api_database.slave_connection = ****
    api_database.sqlite_synchronous = True

    devices.enabled_mdev_types = []

    ephemeral_storage_encryption.cipher = aes-xts-plain64
    ephemeral_storage_encryption.enabled = False
    ephemeral_storage_encryption.key_size = 512

    glance.api_servers = None
    glance.cafile = None
    glance.certfile = None
    glance.collect_timing = False
    glance.connect_retries = None
    glance.connect_retry_delay = None
    glance.debug = False
    glance.default_trusted_certificate_ids = []
    glance.enable_certificate_validation = False
    glance.enable_rbd_download = False
    glance.endpoint_override = None
    glance.insecure = False
    glance.keyfile = None
    glance.max_version = None
    glance.min_version = None
    glance.num_retries = 3
    glance.rbd_ceph_conf =
    glance.rbd_connect_timeout = 5
    glance.rbd_pool =
    glance.rbd_user =
    glance.region_name = None
    glance.service_name = None
    glance.service_type = image
    glance.split_loggers = False
    glance.status_code_retries = None
    glance.status_code_retry_delay = None
    glance.timeout = None
    glance.valid_interfaces = ['internal', 'public']
    glance.verify_glance_signatures = False
    glance.version = None
    guestfs.debug = False

    hyperv.config_drive_cdrom = False
    hyperv.config_drive_inject_password = False
    hyperv.dynamic_memory_ratio = 1.0
    hyperv.enable_instance_metrics_collection = False
    hyperv.enable_remotefx = False
    hyperv.instances_path_share =
    hyperv.iscsi_initiator_list = []
    hyperv.limit_cpu_features = False
    hyperv.mounted_disk_query_retry_count = 10
    hyperv.mounted_disk_query_retry_interval = 5
    hyperv.power_state_check_timeframe = 60
    hyperv.power_state_event_polling_interval = 2
    hyperv.qemu_img_cmd = qemu-img.exe
    hyperv.use_multipath_io = False
    hyperv.volume_attach_retry_count = 10
    hyperv.volume_attach_retry_interval = 5
    hyperv.vswitch_name = None
    hyperv.wait_soft_reboot_seconds = 60

    mks.enabled = False
    mks.mksproxy_base_url = http://127.0.0.1:6090/

    image_cache.manager_interval = 2400
    image_cache.precache_concurrency = 1
    image_cache.remove_unused_base_images = True
    image_cache.remove_unused_original_minimum_age_seconds = 86400
    image_cache.remove_unused_resized_minimum_age_seconds = 3600
    image_cache.subdirectory_name = _base
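The [image_cache] values set the cache manager cadence and removal thresholds: a pass every 2400 s, unused original base images removable after 86400 s (24 h), resized copies after 3600 s (1 h). Illustrative arithmetic only, not Nova's implementation:

    # Age-eligibility arithmetic implied by the [image_cache] values above.
    import time

    ORIGINAL_MIN_AGE = 86400  # image_cache.remove_unused_original_minimum_age_seconds
    RESIZED_MIN_AGE = 3600    # image_cache.remove_unused_resized_minimum_age_seconds

    def eligible_for_removal(last_used: float, resized: bool, now: float) -> bool:
        min_age = RESIZED_MIN_AGE if resized else ORIGINAL_MIN_AGE
        return (now - last_used) >= min_age

    now = time.time()
    two_hours_ago = now - 7200
    assert eligible_for_removal(two_hours_ago, resized=True, now=now)       # 2 h >= 1 h
    assert not eligible_for_removal(two_hours_ago, resized=False, now=now)  # 2 h < 24 h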
    ironic.api_max_retries = 60
    ironic.api_retry_interval = 2
    ironic.auth_section = None
    ironic.auth_type = None
    ironic.cafile = None
    ironic.certfile = None
    ironic.collect_timing = False
    ironic.conductor_group = None
    ironic.connect_retries = None
    ironic.connect_retry_delay = None
    ironic.endpoint_override = None
    ironic.insecure = False
    ironic.keyfile = None
    ironic.max_version = None
    ironic.min_version = None
    ironic.peer_list = []
    ironic.region_name = None
    ironic.serial_console_state_timeout = 10
    ironic.service_name = None
    ironic.service_type = baremetal
    ironic.split_loggers = False
    ironic.status_code_retries = None
    ironic.status_code_retry_delay = None
    ironic.timeout = None
    ironic.valid_interfaces = ['internal', 'public']
    ironic.version = None

    key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager
    key_manager.fixed_key = ****
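key_manager.backend selects Nova's ConfKeyManager, which serves a single key derived from the (masked) key_manager.fixed_key; the [barbican] and [vault] groups that follow configure the alternative castellan backends. A hedged sketch of the generic castellan access pattern:

    # Generic castellan access pattern (sketch). With the ConfKeyManager
    # backend above, every retrieval resolves to the single fixed_key.
    from castellan import key_manager
    from oslo_config import cfg
    from oslo_context import context

    CONF = cfg.CONF
    km = key_manager.API(CONF)        # instantiates the key_manager.backend class

    ctxt = context.RequestContext()   # credentials elided in this sketch
    # key = km.get(ctxt, managed_object_id)   # id elided; returns the stored key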
    barbican.auth_endpoint = http://localhost/identity/v3
    barbican.barbican_api_version = None
    barbican.barbican_endpoint = None
    barbican.barbican_endpoint_type = public
    barbican.barbican_region_name = None
    barbican.cafile = None
    barbican.certfile = None
    barbican.collect_timing = False
    barbican.insecure = False
    barbican.keyfile = None
    barbican.number_of_retries = 60
    barbican.retry_delay = 1
    barbican.send_service_user_token = False
    barbican.split_loggers = False
    barbican.timeout = None
    barbican.verify_ssl = True
    barbican.verify_ssl_path = None

    barbican_service_user.auth_section = None
    barbican_service_user.auth_type = None
    barbican_service_user.cafile = None
    barbican_service_user.certfile = None
    barbican_service_user.collect_timing = False
    barbican_service_user.insecure = False
    barbican_service_user.keyfile = None
    barbican_service_user.split_loggers = False
    barbican_service_user.timeout = None
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.066097] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.collect_timing = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.066264] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.insecure = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.066426] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.keyfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.066601] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.kv_mountpoint = secret {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.066764] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.kv_path = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.066931] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.kv_version = 2 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.067104] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.namespace = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.067269] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.root_token_id = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.067452] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.split_loggers = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.067630] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.ssl_ca_crt_file = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.067792] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.timeout = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.067956] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.use_ssl = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.068146] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.068325] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.auth_section = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.068497] env[67977]: DEBUG oslo_service.service [None 
req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.auth_type = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.068663] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.cafile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.068829] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.certfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.068996] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.collect_timing = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.069173] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.connect_retries = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.069338] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.connect_retry_delay = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.069530] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.endpoint_override = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.069701] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.insecure = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.069864] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.keyfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.070034] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.max_version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.070199] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.min_version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.070357] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.region_name = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.070542] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.service_name = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.070722] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.service_type = identity {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.070892] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.split_loggers = False {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.071067] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.status_code_retries = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.071276] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.status_code_retry_delay = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.071396] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.timeout = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.071579] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.071748] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] keystone.version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.071955] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.connection_uri = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.072135] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.cpu_mode = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.072309] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.cpu_model_extra_flags = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.072489] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.cpu_models = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.072692] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.cpu_power_governor_high = performance {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.072872] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.cpu_power_governor_low = powersave {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.073052] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.cpu_power_management = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.073234] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.073404] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.device_detach_attempts = 8 {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.073570] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.device_detach_timeout = 20 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.073768] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.disk_cachemodes = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.073939] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.disk_prefix = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.074123] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.enabled_perf_events = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.074295] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.file_backed_memory = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.074463] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.gid_maps = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.074628] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.hw_disk_discard = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.074787] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.hw_machine_type = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.074959] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.images_rbd_ceph_conf = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.075141] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.075316] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.075489] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.images_rbd_glance_store_name = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.075663] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.images_rbd_pool = rbd {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.075837] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.images_type = default {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.075998] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.images_volume_group = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.076180] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.inject_key = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.076343] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.inject_partition = -2 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.076507] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.inject_password = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.076674] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.iscsi_iface = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.076838] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.iser_use_multipath = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.077008] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_bandwidth = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.077181] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_completion_timeout = 800 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.077349] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_downtime = 500 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.077583] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_downtime_delay = 75 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.077847] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_downtime_steps = 10 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.078071] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_inbound_addr = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.078259] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_permit_auto_converge = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.078432] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_permit_post_copy = False {{(pid=67977) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.078600] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_scheme = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.078778] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_timeout_action = abort {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.078950] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_tunnelled = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.079127] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_uri = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.079300] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.live_migration_with_native_tls = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.079484] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.max_queues = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.079661] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.mem_stats_period_seconds = 10 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.079825] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.nfs_mount_options = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.080166] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.080345] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.num_aoe_discover_tries = 3 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.080533] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.num_iser_scan_tries = 5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.080704] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.num_memory_encrypted_guests = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.080872] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.num_nvme_discover_tries = 5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.081049] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.num_pcie_ports = 0 
{{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.081222] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.num_volume_scan_tries = 5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.081391] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.pmem_namespaces = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.081554] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.quobyte_client_cfg = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.081848] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.082034] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.rbd_connect_timeout = 5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.082207] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.082375] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.082540] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.rbd_secret_uuid = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.082704] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.rbd_user = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.082867] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.realtime_scheduler_priority = 1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.083061] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.remote_filesystem_transport = ssh {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.083233] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.rescue_image_id = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.083396] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.rescue_kernel_id = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.083556] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.rescue_ramdisk_id = None {{(pid=67977) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.083728] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.rng_dev_path = /dev/urandom {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.083889] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.rx_queue_size = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.084068] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.smbfs_mount_options = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.084345] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.084532] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.snapshot_compression = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.084725] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.snapshot_image_format = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.084950] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.085138] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.sparse_logical_volumes = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.085309] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.swtpm_enabled = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.085484] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.swtpm_group = tss {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.085671] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.swtpm_user = tss {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.085851] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.sysinfo_serial = unique {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.086020] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.tb_cache_size = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.086187] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.tx_queue_size = None {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.086351] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.uid_maps = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.086518] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.use_virtio_for_bridges = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.086694] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.virt_type = kvm {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.086864] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.volume_clear = zero {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.087039] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.volume_clear_size = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.087213] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.volume_use_multipath = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.087376] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.vzstorage_cache_path = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.087548] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.087725] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.vzstorage_mount_group = qemu {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.087895] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.vzstorage_mount_opts = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.088076] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.088364] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.088548] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.vzstorage_mount_user = stack {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.088722] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=67977) 
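Every entry in this dump comes from one oslo.config call: each line names `log_opt_values` at `oslo_config/cfg.py:2609`, which walks every registered option and emits one DEBUG line per `group.option = value` pair at service startup. A minimal, self-contained sketch of that mechanism (a standalone illustration, not Nova source; the two sample options mirror `[libvirt]` defaults reported above):

```python
import logging

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.CONF

# Register two of the [libvirt] options seen above, with the same
# defaults the log reports (virt_type = kvm, rng_dev_path = /dev/urandom).
CONF.register_opts(
    [
        cfg.StrOpt('virt_type', default='kvm'),
        cfg.StrOpt('rng_dev_path', default='/dev/urandom'),
    ],
    group='libvirt',
)

# Parse an empty command line; default_config_files=[] keeps the sketch
# from picking up a real nova.conf on the machine running it.
CONF([], project='nova', default_config_files=[])

# Dumps every registered option, one DEBUG line per
# "group.option = value" pair, exactly the shape of the lines above.
CONF.log_opt_values(LOG, logging.DEBUG)
```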
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.088900] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.auth_section = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.089089] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.auth_type = password {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.089255] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.cafile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.089436] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.certfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.089624] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.collect_timing = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.089799] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.connect_retries = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.089968] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.connect_retry_delay = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.090158] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.default_floating_pool = public {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.090325] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.endpoint_override = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.090516] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.extension_sync_interval = 600 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.090689] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.http_retries = 3 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.090858] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.insecure = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.091037] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.keyfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.091209] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.max_version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.091386] env[67977]: 
DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.metadata_proxy_shared_secret = **** {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.091550] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.min_version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.091726] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.ovs_bridge = br-int {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.091894] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.physnets = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.092080] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.region_name = RegionOne {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.092257] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.service_metadata_proxy = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.092421] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.service_name = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.092595] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.service_type = network {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.092763] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.split_loggers = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.092922] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.status_code_retries = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.093095] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.status_code_retry_delay = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.093259] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.timeout = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.093442] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.093610] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] neutron.version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.093786] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None 
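The `keystone`, `neutron`, and `placement` groups in this dump share the same session and adapter option set (`cafile`, `certfile`, `keyfile`, `insecure`, `timeout`, `service_type`, `valid_interfaces`, `region_name`, `endpoint_override`, version bounds, retry knobs) because Nova registers them through keystoneauth1's loading helpers rather than declaring each one by hand. A sketch of that registration (standalone illustration, not Nova source; assumes keystoneauth1 is installed):

```python
from keystoneauth1 import loading
from oslo_config import cfg

CONF = cfg.CONF

# Registers cafile/certfile/keyfile/insecure/timeout/... under [neutron].
loading.register_session_conf_options(CONF, 'neutron')
# Registers service_type/valid_interfaces/region_name/endpoint_override/
# min_version/max_version/connect_retries/... under [neutron].
loading.register_adapter_conf_options(CONF, 'neutron')

CONF([], project='nova', default_config_files=[])
# None here with keystoneauth1's stock default; Nova overrides it,
# which is why the dump above shows ['internal', 'public'].
print(CONF.neutron.valid_interfaces)
```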
None] notifications.bdms_in_notifications = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.093967] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] notifications.default_level = INFO {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.094159] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] notifications.notification_format = unversioned {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.094327] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] notifications.notify_on_state_change = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.094507] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.094687] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] pci.alias = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.094861] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] pci.device_spec = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.095041] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] pci.report_in_placement = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.095225] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.auth_section = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.095401] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.auth_type = password {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.095575] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.auth_url = http://10.180.1.21/identity {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.095739] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.cafile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.095901] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.certfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.096079] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.collect_timing = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.096246] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] 
placement.connect_retries = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.096407] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.connect_retry_delay = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.096587] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.default_domain_id = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.096774] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.default_domain_name = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.096940] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.domain_id = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.097114] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.domain_name = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.097280] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.endpoint_override = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.097449] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.insecure = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.097615] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.keyfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.097779] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.max_version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.097939] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.min_version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.098123] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.password = **** {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.098290] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.project_domain_id = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.098461] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.project_domain_name = Default {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.098637] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.project_id = None {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.098812] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.project_name = service {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.098985] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.region_name = RegionOne {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.099168] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.service_name = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.099340] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.service_type = placement {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.099537] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.split_loggers = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.099712] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.status_code_retries = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.099879] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.status_code_retry_delay = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.100053] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.system_scope = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.100217] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.timeout = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.100391] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.trust_id = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.100578] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.user_domain_id = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.100759] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.user_domain_name = Default {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.100925] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.user_id = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.101118] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.username = placement {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
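Note the masked values in this dump, e.g. `neutron.metadata_proxy_shared_secret = ****` above and `placement.password = ****`: options registered with `secret=True` are redacted by `log_opt_values`, so credentials never reach the log. A minimal sketch (illustrative option value, not Nova source):

```python
import logging

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.CONF
# secret=True is what turns the logged value into "****".
CONF.register_opts([cfg.StrOpt('password', secret=True)], group='placement')
CONF([], project='nova', default_config_files=[])

# Hypothetical value, set only so there is something to redact.
CONF.set_override('password', 'example-secret', group='placement')

CONF.log_opt_values(LOG, logging.DEBUG)
# ... placement.password = **** ...
```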
666.101308] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.101475] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] placement.version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.101658] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.cores = 20 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.101827] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.count_usage_from_placement = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.102014] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.102192] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.injected_file_content_bytes = 10240 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.102365] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.injected_file_path_length = 255 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.102535] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.injected_files = 5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.102710] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.instances = 10 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.102877] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.key_pairs = 100 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.103059] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.metadata_items = 128 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.103233] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.ram = 51200 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.103400] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.recheck_quota = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.103574] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] quota.server_group_members = 10 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.103741] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None 
None] quota.server_groups = 10 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.103913] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] rdp.enabled = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.104247] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.104438] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.104613] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.104782] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] scheduler.image_metadata_prefilter = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.104948] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.105132] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] scheduler.max_attempts = 3 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.105302] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] scheduler.max_placement_results = 1000 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.105472] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.105642] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] scheduler.query_placement_for_image_type_support = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.105805] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.105983] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] scheduler.workers = 2 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.106172] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
666.106351] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.106536] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.106715] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.106883] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.107062] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.107236] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.107431] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.107604] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.host_subset_size = 1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.107777] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.107941] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.image_properties_default_architecture = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.108122] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.108296] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.isolated_hosts = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.108466] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.isolated_images = [] 
{{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.108635] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.max_instances_per_host = 50 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.108799] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.108965] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.109146] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.pci_in_placement = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.109318] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.109511] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.109697] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.109868] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.110048] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.110222] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.110423] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.track_instance_changes = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.110649] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.110833] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] metrics.required = True {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.111021] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] metrics.weight_multiplier = 1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.111190] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] metrics.weight_of_unavailable = -10000.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.111358] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] metrics.weight_setting = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.111660] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.111841] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] serial_console.enabled = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.112032] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] serial_console.port_range = 10000:20000 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.112215] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.112389] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.112559] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] serial_console.serialproxy_port = 6083 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.112732] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] service_user.auth_section = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.112909] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] service_user.auth_type = password {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.113084] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] service_user.cafile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.113249] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] service_user.certfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.113415] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] service_user.collect_timing = False {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.113581] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] service_user.insecure = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.113741] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] service_user.keyfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.113928] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] service_user.send_service_user_token = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.114108] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] service_user.split_loggers = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.114270] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] service_user.timeout = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.114442] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.agent_enabled = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.114610] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.enabled = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.114902] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.115108] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.html5proxy_host = 0.0.0.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.115281] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.html5proxy_port = 6082 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.115445] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.image_compression = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.115612] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.jpeg_compression = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.115771] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.playback_compression = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.115943] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.server_listen = 127.0.0.1 {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.116128] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.116291] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.streaming_mode = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.116452] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] spice.zlib_compression = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.116623] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] upgrade_levels.baseapi = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.116784] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] upgrade_levels.cert = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.116956] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] upgrade_levels.compute = auto {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.117138] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] upgrade_levels.conductor = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.117298] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] upgrade_levels.scheduler = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.117487] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vendordata_dynamic_auth.auth_section = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.117660] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vendordata_dynamic_auth.auth_type = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.117824] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vendordata_dynamic_auth.cafile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.117985] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vendordata_dynamic_auth.certfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.118167] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vendordata_dynamic_auth.collect_timing = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.118332] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vendordata_dynamic_auth.insecure = False {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.118496] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vendordata_dynamic_auth.keyfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.118667] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vendordata_dynamic_auth.split_loggers = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.118821] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vendordata_dynamic_auth.timeout = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.118997] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.api_retry_count = 10 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.119174] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.ca_file = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.119362] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.cache_prefix = devstack-image-cache {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.119568] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.cluster_name = testcl1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.119747] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.connection_pool_size = 10 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.119929] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.console_delay_seconds = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.120136] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.datastore_regex = ^datastore.* {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.120350] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.120527] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.host_password = **** {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.120699] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.host_port = 443 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.120871] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.host_username = administrator@vsphere.local {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.121055] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.insecure = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.121224] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.integration_bridge = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.121393] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.maximum_objects = 100 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.121555] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.pbm_default_policy = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.121724] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.pbm_enabled = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.121885] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.pbm_wsdl_location = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.122066] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.122232] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.serial_port_proxy_uri = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.122392] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.serial_port_service_uri = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.122562] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.task_poll_interval = 0.5 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.122737] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.use_linked_clone = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.123210] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.vnc_keymap = en-us {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.123210] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.vnc_port = 5900 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.123274] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vmware.vnc_port_total = 10000 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.123419] 
env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vnc.auth_schemes = ['none'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.123596] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vnc.enabled = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.123889] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.124090] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.124268] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vnc.novncproxy_port = 6080 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.124446] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vnc.server_listen = 127.0.0.1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.124625] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.124787] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vnc.vencrypt_ca_certs = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.124947] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vnc.vencrypt_client_cert = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.125120] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vnc.vencrypt_client_key = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.125299] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.125465] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.disable_deep_image_inspection = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.125631] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.disable_fallback_pcpu_query = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.125793] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.disable_group_policy_check_upcall = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
666.125954] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.126133] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.disable_rootwrap = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.126297] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.enable_numa_live_migration = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.126459] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.126622] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.126783] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.handle_virt_lifecycle_events = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.126945] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.libvirt_disable_apic = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.127124] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.never_download_image_if_on_rbd = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.127292] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.127477] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.127648] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.127812] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.127979] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.128159] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None 
None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.128323] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.128488] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.128657] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.128844] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.129026] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] wsgi.client_socket_timeout = 900 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.129200] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] wsgi.default_pool_size = 1000 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.129373] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] wsgi.keep_alive = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.129571] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] wsgi.max_header_line = 16384 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.129743] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] wsgi.secure_proxy_ssl_header = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.129909] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] wsgi.ssl_ca_file = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.130087] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] wsgi.ssl_cert_file = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.130256] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] wsgi.ssl_key_file = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.130429] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] wsgi.tcp_keepidle = 600 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.130607] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] 
wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.130777] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] zvm.ca_file = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.130940] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] zvm.cloud_connector_url = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.131242] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.131422] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] zvm.reachable_timeout = 300 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.131644] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_policy.enforce_new_defaults = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.131827] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_policy.enforce_scope = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.132015] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_policy.policy_default_rule = default {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.132206] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.132387] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_policy.policy_file = policy.yaml {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.132563] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.132733] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.132894] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.133065] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.133234] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.133406] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.133587] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.133766] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler.connection_string = messaging:// {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.133937] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler.enabled = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.134122] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler.es_doc_type = notification {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.134289] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler.es_scroll_size = 10000 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.134459] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler.es_scroll_time = 2m {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.134627] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler.filter_error_trace = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.134798] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler.hmac_keys = **** {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.134963] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler.sentinel_service_name = mymaster {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.135143] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler.socket_timeout = 0.1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.135308] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler.trace_requests = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.135469] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler.trace_sqlalchemy = False {{(pid=67977) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.135658] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler_jaeger.process_tags = {} {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.135824] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler_jaeger.service_name_prefix = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.135991] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] profiler_otlp.service_name_prefix = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.136173] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] remote_debug.host = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.136333] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] remote_debug.port = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.136514] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.136680] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.136844] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.137027] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.137188] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.137371] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.137546] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.137715] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.137877] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.138050] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.138228] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.138398] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.138576] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.138752] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.138921] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.139111] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.139279] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.139466] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.139655] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.139826] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.139989] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.140176] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 
{{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.140341] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.140504] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.140673] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.140845] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.ssl = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.141026] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.141204] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.141369] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.141541] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.141716] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_rabbit.ssl_version = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.141906] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.142088] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_notifications.retry = -1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.142275] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.142453] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_messaging_notifications.transport_url = **** {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.142634] env[67977]: DEBUG oslo_service.service 
[None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.auth_section = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.142804] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.auth_type = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.142964] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.cafile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.143146] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.certfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.143314] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.collect_timing = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.143477] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.connect_retries = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.143640] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.connect_retry_delay = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.143802] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.endpoint_id = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.143961] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.endpoint_override = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.144137] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.insecure = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.144300] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.keyfile = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.144460] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.max_version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.144619] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.min_version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.144776] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.region_name = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.144932] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.service_name = None {{(pid=67977) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.145102] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.service_type = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.145269] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.split_loggers = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.145428] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.status_code_retries = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.145591] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.status_code_retry_delay = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.145753] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.timeout = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.145913] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.valid_interfaces = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.146084] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_limit.version = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.146255] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_reports.file_event_handler = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.146425] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_reports.file_event_handler_interval = 1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.146589] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] oslo_reports.log_dir = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.146763] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.146926] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_linux_bridge_privileged.group = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.147100] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.147277] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_linux_bridge_privileged.logger_name = 
oslo_privsep.daemon {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.147480] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.147654] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_linux_bridge_privileged.user = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.147831] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.147996] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_ovs_privileged.group = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.148176] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_ovs_privileged.helper_command = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.148346] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.148512] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.148676] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] vif_plug_ovs_privileged.user = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.148847] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_linux_bridge.flat_interface = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.149040] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.149223] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.149422] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.149595] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.149775] 
env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.149944] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.150126] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_linux_bridge.vlan_interface = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.150321] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.150495] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_ovs.isolate_vif = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.150666] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.150834] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.151010] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.151192] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_ovs.ovsdb_interface = native {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.151359] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_vif_ovs.per_port_bridge = False {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.151528] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_brick.lock_path = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.151698] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.151865] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] os_brick.wait_mpath_device_interval = 1 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.152048] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] privsep_osbrick.capabilities = [21] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.152217] 
env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] privsep_osbrick.group = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.152379] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] privsep_osbrick.helper_command = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.152547] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.152714] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] privsep_osbrick.thread_pool_size = 8 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.152877] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] privsep_osbrick.user = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.153062] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.153228] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] nova_sys_admin.group = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.153391] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] nova_sys_admin.helper_command = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.153560] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.153726] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] nova_sys_admin.thread_pool_size = 8 {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.153888] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] nova_sys_admin.user = None {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 666.154030] env[67977]: DEBUG oslo_service.service [None req-58574229-6676-4524-98f8-9066d8f1487f None None] ******************************************************************************** {{(pid=67977) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 666.154457] env[67977]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 666.163962] env[67977]: WARNING nova.virt.vmwareapi.driver [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. 
If you are using the driver in production please let us know via the openstack-discuss mailing list. [ 666.164417] env[67977]: INFO nova.virt.node [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Generated node identity cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 [ 666.164644] env[67977]: INFO nova.virt.node [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Wrote node identity cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 to /opt/stack/data/n-cpu-1/compute_id [ 666.177112] env[67977]: WARNING nova.compute.manager [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Compute nodes ['cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 666.209498] env[67977]: INFO nova.compute.manager [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 666.230785] env[67977]: WARNING nova.compute.manager [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 666.231018] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 666.231238] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 666.231403] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 666.231552] env[67977]: DEBUG nova.compute.resource_tracker [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 666.232779] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d535af08-6456-4cc2-9245-4888a975b9af {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.241740] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2211df3a-6b85-4868-a4e9-1c21e8b1ca48 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.255413] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c09183e6-adf6-42cb-a46e-628db7823f62 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.261679] env[67977]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb216b9a-1941-415d-b4f0-fd9a7acf7055 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.290962] env[67977]: DEBUG nova.compute.resource_tracker [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180914MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 666.291085] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 666.291262] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 666.303447] env[67977]: WARNING nova.compute.resource_tracker [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] No compute node record for cpu-1:cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 could not be found. [ 666.315300] env[67977]: INFO nova.compute.resource_tracker [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 [ 666.367982] env[67977]: DEBUG nova.compute.resource_tracker [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 666.367982] env[67977]: DEBUG nova.compute.resource_tracker [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 666.471534] env[67977]: INFO nova.scheduler.client.report [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] [req-7db96ee1-050a-45ef-a64c-1172bd9f696d] Created resource provider record via placement API for resource provider with UUID cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
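Note: the inventory dicts the resource tracker pushes to Placement just below map each resource class to schedulable capacity: (total - reserved) * allocation_ratio, allocated between min_unit and max_unit in multiples of step_size. A quick sanity check with the values from this log (plain Python, not Nova or Placement code):

    # Inventory as reported for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382,
    # copied from the log records below.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                 'step_size': 1, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                      'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94,
                    'step_size': 1, 'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        # Placement computes capacity as (total - reserved) * allocation_ratio;
        # max_unit caps how much a single instance may claim.
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: capacity {capacity:g}, max single allocation {inv['max_unit']}")
    # VCPU: capacity 192, max single allocation 16
    # MEMORY_MB: capacity 196078, max single allocation 65530
    # DISK_GB: capacity 400, max single allocation 94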
[ 666.489200] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a833ff0-14a9-4aae-ac45-b87e3a0eda10 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.497944] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc8f0ced-b80b-44c1-b7ef-fd13a6c475e1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.528075] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f34afeb-c1f4-465a-a33a-648e000de07f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.535402] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba196d16-3d7c-4d9c-90e5-bb89fdaa6721 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.548591] env[67977]: DEBUG nova.compute.provider_tree [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Updating inventory in ProviderTree for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 666.587085] env[67977]: DEBUG nova.scheduler.client.report [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Updated inventory for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 666.587349] env[67977]: DEBUG nova.compute.provider_tree [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Updating resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 generation from 0 to 1 during operation: update_inventory {{(pid=67977) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 666.587535] env[67977]: DEBUG nova.compute.provider_tree [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Updating inventory in ProviderTree for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 666.633584] env[67977]: DEBUG nova.compute.provider_tree [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Updating resource 
provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 generation from 1 to 2 during operation: update_traits {{(pid=67977) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 666.651037] env[67977]: DEBUG nova.compute.resource_tracker [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 666.651226] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.360s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 666.651392] env[67977]: DEBUG nova.service [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Creating RPC server for service compute {{(pid=67977) start /opt/stack/nova/nova/service.py:182}} [ 666.666449] env[67977]: DEBUG nova.service [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] Join ServiceGroup membership for this service compute {{(pid=67977) start /opt/stack/nova/nova/service.py:199}} [ 666.666642] env[67977]: DEBUG nova.servicegroup.drivers.db [None req-9a7adf0d-b618-4f5d-aef7-d5c1c570813a None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=67977) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 675.987833] env[67977]: DEBUG dbcounter [-] [67977] Writing DB stats nova_cell0:SELECT=1 {{(pid=67977) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 675.991461] env[67977]: DEBUG dbcounter [-] [67977] Writing DB stats nova_cell1:SELECT=1 {{(pid=67977) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 698.668741] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 698.682157] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Getting list of instances from cluster (obj){ [ 698.682157] env[67977]: value = "domain-c8" [ 698.682157] env[67977]: _type = "ClusterComputeResource" [ 698.682157] env[67977]: } {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 698.683239] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51cf5a9b-dac4-4db2-b41d-49aed6c94646 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.692609] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Got total of 0 instances {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 698.692845] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 698.693193] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Getting list of instances from cluster (obj){ [ 698.693193] 
env[67977]: value = "domain-c8" [ 698.693193] env[67977]: _type = "ClusterComputeResource" [ 698.693193] env[67977]: } {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 698.694098] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-900c8bb5-beb1-4cf0-9330-0d20107dd5be {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.701726] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Got total of 0 instances {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 710.369910] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Acquiring lock "7900e978-def6-4636-a2cb-94c322a23d15" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 710.370249] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Lock "7900e978-def6-4636-a2cb-94c322a23d15" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 710.392265] env[67977]: DEBUG nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Starting instance... 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 710.521963] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 710.522263] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 710.523863] env[67977]: INFO nova.compute.claims [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 710.703370] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-040c2929-65e8-4cc9-9f6a-53e31b16e254 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.716331] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55949e09-653e-40e4-900a-1b4ba3a9c0ae {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.755045] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c864b4e-2a72-4e61-b889-bc4c2bab9076 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.763192] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5617b314-20af-4e6e-b06e-c6f5fd311d50 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.781601] env[67977]: DEBUG nova.compute.provider_tree [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 710.796497] env[67977]: DEBUG nova.scheduler.client.report [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 710.813956] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c 
tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 710.814140] env[67977]: DEBUG nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 710.881291] env[67977]: DEBUG nova.compute.utils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 710.882128] env[67977]: DEBUG nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Not allocating networking since 'none' was specified. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 710.905681] env[67977]: DEBUG nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 711.042137] env[67977]: DEBUG nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 711.923044] env[67977]: DEBUG nova.virt.hardware [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 711.923444] env[67977]: DEBUG nova.virt.hardware [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 711.923809] env[67977]: DEBUG nova.virt.hardware [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 711.923869] env[67977]: DEBUG nova.virt.hardware [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 711.924129] env[67977]: DEBUG nova.virt.hardware [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 711.924410] env[67977]: DEBUG nova.virt.hardware [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 711.924760] env[67977]: DEBUG nova.virt.hardware [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 711.925350] env[67977]: DEBUG nova.virt.hardware [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 711.925530] env[67977]: DEBUG nova.virt.hardware [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c 
tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 711.925826] env[67977]: DEBUG nova.virt.hardware [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 711.926259] env[67977]: DEBUG nova.virt.hardware [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 711.929123] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58e33684-07f2-4529-8848-9f65a8d28d3f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.940483] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b39355b-307c-42ba-84ba-3a792295ee66 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.967374] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36ad2778-abe6-40db-b106-a71f9305dc6c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.987289] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Instance VIF info [] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 711.997868] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 711.997868] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d742de03-8a56-431f-b15c-64f3637733ab {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.012106] env[67977]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 712.012583] env[67977]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=67977) _invoke_api /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:337}} [ 712.013024] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Folder already exists: OpenStack. Parent ref: group-v4. 
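{{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}}

Note: the DuplicateName fault above is the expected path when the OpenStack folder already exists in vCenter; the SOAP fault arrives with HTTP 200, is decoded by oslo.vmware, and the driver logs it and carries on. A minimal, self-contained sketch of that idempotent-create pattern (stand-in names, not Nova's actual code):

    class DuplicateName(Exception):
        """Stand-in for the vSphere DuplicateName fault."""

    def create_folder(existing, parent_ref, name):
        # Mimics Folder.CreateFolder: fails if the child name is already taken.
        key = (parent_ref, name)
        if key in existing:
            raise DuplicateName(name)
        existing.add(key)
        return key

    def create_folder_idempotent(existing, parent_ref, name):
        try:
            return create_folder(existing, parent_ref, name)
        except DuplicateName:
            # Folder already exists: reuse it instead of failing the build.
            return (parent_ref, name)

    folders = set()
    create_folder_idempotent(folders, 'group-v4', 'OpenStack')  # creates it
    create_folder_idempotent(folders, 'group-v4', 'OpenStack')  # reuses it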
[ 712.013813] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Creating folder: Project (a2d9c0bc08a54480a20ab2c770300317). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 712.013813] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7df60bf3-8a44-4898-8c2e-9ba89a2de206 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.025147] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Created folder: Project (a2d9c0bc08a54480a20ab2c770300317) in parent group-v693022. [ 712.025147] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Creating folder: Instances. Parent ref: group-v693026. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 712.025147] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-563bbe2c-642a-43bc-b97a-20676d31e167 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.035701] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Created folder: Instances in parent group-v693026. [ 712.035986] env[67977]: DEBUG oslo.service.loopingcall [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 712.036525] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 712.036725] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e8306902-f08a-47f0-a69c-97a9f4ab6bda {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.055468] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 712.055468] env[67977]: value = "task-3468086" [ 712.055468] env[67977]: _type = "Task" [ 712.055468] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 712.070109] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468086, 'name': CreateVM_Task} progress is 0%.
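{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}

Note: the recurring "Task: {...} progress is N%" records come from oslo.vmware polling the vCenter task object until it reaches a terminal state. A generic sketch of such a poll loop (assumed task-info shape, not oslo.vmware's actual implementation):

    import time

    def wait_for_task(get_task_info, interval=0.5):
        # Poll until the task succeeds or errors, reporting progress meanwhile.
        while True:
            info = get_task_info()  # e.g. {'state': 'running', 'progress': 25}
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                raise RuntimeError(info.get('error', 'task failed'))
            print(f"Task progress is {info.get('progress', 0)}%.")
            time.sleep(interval)

    # Demo with a fake task that completes on the third poll.
    states = iter([{'state': 'running', 'progress': 0},
                   {'state': 'running', 'progress': 25},
                   {'state': 'success', 'result': 'vm-123'}])
    print(wait_for_task(lambda: next(states), interval=0))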
[ 712.576084] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Acquiring lock "04e59d76-a2d5-482c-90a0-fcb407c0bd4e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 712.576325] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Lock "04e59d76-a2d5-482c-90a0-fcb407c0bd4e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 712.577375] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468086, 'name': CreateVM_Task} progress is 25%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 712.591737] env[67977]: DEBUG nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 712.601744] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquiring lock "f6e698af-6d7e-40d5-988b-450f300b67a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 712.602025] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Lock "f6e698af-6d7e-40d5-988b-450f300b67a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 712.623600] env[67977]: DEBUG nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Starting instance...
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 712.715865] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 712.716159] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 712.718767] env[67977]: INFO nova.compute.claims [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 712.723288] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 712.866208] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb1dc5b5-8f02-4806-8217-795a7d1f4754 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.874358] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae445ccd-ac62-4845-9a76-3e8029609e58 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.911660] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17d13023-793d-404c-b44c-d040c6e02de7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.923182] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6eef854a-99c2-4e86-9d9f-62405650741b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.938891] env[67977]: DEBUG nova.compute.provider_tree [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 712.953363] env[67977]: DEBUG nova.scheduler.client.report [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 712.969882] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.254s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 712.970406] env[67977]: DEBUG nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 712.973842] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.250s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 712.975156] env[67977]: INFO nova.compute.claims [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 713.037568] env[67977]: DEBUG nova.compute.utils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 713.038569] env[67977]: DEBUG nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 713.038807] env[67977]: DEBUG nova.network.neutron [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 713.067524] env[67977]: DEBUG nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 713.078297] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468086, 'name': CreateVM_Task} progress is 25%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 713.132417] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e6a2895-ff4a-4f40-9056-d42ccb24a792 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.141783] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c90b934d-9d9c-45fc-bd1e-474a36f3b4a4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.178895] env[67977]: DEBUG nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 713.181749] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1059c33e-7022-4791-a1a6-c2261d7c1d88 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.193026] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7dbd5e2-a25c-4606-9547-f10f1333295f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.207805] env[67977]: DEBUG nova.compute.provider_tree [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 713.228896] env[67977]: DEBUG nova.virt.hardware [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 713.229056] env[67977]: DEBUG nova.virt.hardware [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 713.229099] env[67977]: DEBUG nova.virt.hardware [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Image limits 0:0:0 
{{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 713.229255] env[67977]: DEBUG nova.virt.hardware [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 713.229395] env[67977]: DEBUG nova.virt.hardware [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 713.229537] env[67977]: DEBUG nova.virt.hardware [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 713.229742] env[67977]: DEBUG nova.virt.hardware [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 713.229900] env[67977]: DEBUG nova.virt.hardware [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 713.230062] env[67977]: DEBUG nova.virt.hardware [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 713.230339] env[67977]: DEBUG nova.virt.hardware [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 713.230442] env[67977]: DEBUG nova.virt.hardware [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 713.231348] env[67977]: DEBUG nova.scheduler.client.report [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
713.234884] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01a802cd-2a7b-424c-83af-a756fdb0991e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.247294] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-768cab07-0e02-4565-bed7-6fa2c239f599 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.263451] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.290s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 713.264011] env[67977]: DEBUG nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 713.310579] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "b623a2f1-404e-4f48-aeb2-ebb372260a86" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 713.310831] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "b623a2f1-404e-4f48-aeb2-ebb372260a86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 713.323250] env[67977]: DEBUG nova.compute.utils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 713.324552] env[67977]: DEBUG nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Allocating IP information in the background. 
{{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 713.324791] env[67977]: DEBUG nova.network.neutron [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 713.328275] env[67977]: DEBUG nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 713.337151] env[67977]: DEBUG nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 713.414462] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 713.414462] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 713.416146] env[67977]: INFO nova.compute.claims [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 713.449290] env[67977]: DEBUG nova.policy [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f11a6a61116486d8dcd5c09b5b39a53', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ac48fabcd394452844d1d65c0a22c4e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 713.452384] env[67977]: DEBUG nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 713.482448] env[67977]: DEBUG nova.virt.hardware [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 713.482768] env[67977]: DEBUG nova.virt.hardware [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 713.483336] env[67977]: DEBUG nova.virt.hardware [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 713.483336] env[67977]: DEBUG nova.virt.hardware [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 713.483639] env[67977]: DEBUG nova.virt.hardware [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 713.483888] env[67977]: DEBUG nova.virt.hardware [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 713.484478] env[67977]: DEBUG nova.virt.hardware [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 713.484478] env[67977]: DEBUG nova.virt.hardware [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 713.485045] env[67977]: DEBUG nova.virt.hardware [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 713.485382] env[67977]: DEBUG nova.virt.hardware [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 713.485680] env[67977]: DEBUG nova.virt.hardware [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 713.487107] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e2a39f7-6840-4919-b2ee-2c58c678d9e6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.496525] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6329299f-61ff-466a-892f-2f4cd35bbff9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.582507] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468086, 'name': CreateVM_Task} progress is 25%. 
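The "CreateVM_Task ... progress is 25%" records above are emitted by oslo.vmware's task poller, which re-reads task state on an interval until it reaches a terminal state. A rough sketch of such a loop; get_task_info() is a hypothetical stand-in for the real TaskInfo read oslo_vmware.api performs through the vCenter PropertyCollector:

```python
import time

# Hypothetical poller in the spirit of oslo_vmware.api._poll_task:
# get_task_info() stands in for the real TaskInfo read done through the
# vCenter PropertyCollector.
def wait_for_task(get_task_info, poll_interval=0.5):
    while True:
        info = get_task_info()
        if info.state == 'success':
            return info.result
        if info.state == 'error':
            raise RuntimeError(info.error)
        # 'queued' / 'running': log progress and poll again, which is what
        # produces the "progress is 25%." lines above.
        print(f"Task {info.key} progress is {info.progress}%.")
        time.sleep(poll_interval)
```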
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 713.643182] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-223abe30-057f-47a2-ae3b-0cd8f39d7f73 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.652470] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cb50d4a-8737-4ba4-968f-e28e476be9cf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.700633] env[67977]: DEBUG nova.policy [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f97c4e013c7b405e8fa5fd62586bdd06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4e4ed49c2670412eb72ce0e7c9cf9ec8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 713.703336] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d453d06-0ab4-4895-b9ce-f72ec289eee0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.711171] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc1ef09f-6155-4296-ba2e-ca91c1838329 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.725369] env[67977]: DEBUG nova.compute.provider_tree [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 713.744294] env[67977]: DEBUG nova.scheduler.client.report [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 713.765404] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.351s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 713.765989] env[67977]: 
DEBUG nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 713.824234] env[67977]: DEBUG nova.compute.utils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 713.826374] env[67977]: DEBUG nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 713.826723] env[67977]: DEBUG nova.network.neutron [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 713.838364] env[67977]: DEBUG nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 713.921867] env[67977]: DEBUG nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 713.950999] env[67977]: DEBUG nova.virt.hardware [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 713.951363] env[67977]: DEBUG nova.virt.hardware [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 713.951496] env[67977]: DEBUG nova.virt.hardware [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 713.951678] env[67977]: DEBUG nova.virt.hardware [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 713.951930] env[67977]: DEBUG nova.virt.hardware [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 713.952019] env[67977]: DEBUG nova.virt.hardware [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 713.952193] env[67977]: DEBUG nova.virt.hardware [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 713.952352] env[67977]: DEBUG nova.virt.hardware [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
713.952523] env[67977]: DEBUG nova.virt.hardware [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 713.952684] env[67977]: DEBUG nova.virt.hardware [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 713.952853] env[67977]: DEBUG nova.virt.hardware [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 713.953800] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4bf85c3-3237-401b-a52e-39875143aca9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.964091] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2b068d6-8b72-4bfc-9e82-ad62ed0b391d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.074721] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468086, 'name': CreateVM_Task, 'duration_secs': 1.701583} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 714.075317] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 714.076126] env[67977]: DEBUG oslo_vmware.service [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-218dad1b-c7f6-4533-b8fa-f0470ed989c0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.084235] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 714.084478] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 714.085209] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Acquired external semaphore "[datastore1] 
devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 714.085499] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e6b3ddcf-23ef-41bf-9c29-51b55123b91e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.092013] env[67977]: DEBUG oslo_vmware.api [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Waiting for the task: (returnval){ [ 714.092013] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ae37cd-b7e9-5d48-1f9f-704ad20b7c5f" [ 714.092013] env[67977]: _type = "Task" [ 714.092013] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 714.100286] env[67977]: DEBUG oslo_vmware.api [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ae37cd-b7e9-5d48-1f9f-704ad20b7c5f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 714.519084] env[67977]: DEBUG nova.policy [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9f70eb89c04248fb857162c0deabeda2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '42c61dc228d94e8bb14985347d4f811a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 714.604946] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 714.605417] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 714.605755] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 714.605755] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 
tempest-ServersAdmin275Test-1481269565-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 714.606151] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 714.606419] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a8875914-4302-4558-9e82-476967ec1395 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.616515] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 714.616789] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 714.617838] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a6d45f2-f60a-406b-ac5f-a34e4436fa27 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.626799] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-74587bcd-d997-4cb1-8e22-62475c05f0eb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.632272] env[67977]: DEBUG oslo_vmware.api [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Waiting for the task: (returnval){ [ 714.632272] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52601f64-b4f8-72d0-9a1c-bc940885a065" [ 714.632272] env[67977]: _type = "Task" [ 714.632272] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 714.640726] env[67977]: DEBUG oslo_vmware.api [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52601f64-b4f8-72d0-9a1c-bc940885a065, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 715.151882] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 715.151882] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Creating directory with path [datastore1] vmware_temp/41c4bdce-1831-4291-a1fd-7e3acdc6075c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 715.152638] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4c2ba659-e444-462b-9cfb-56904edb666b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.185280] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Created directory with path [datastore1] vmware_temp/41c4bdce-1831-4291-a1fd-7e3acdc6075c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 715.185280] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Fetch image to [datastore1] vmware_temp/41c4bdce-1831-4291-a1fd-7e3acdc6075c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 715.185280] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/41c4bdce-1831-4291-a1fd-7e3acdc6075c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 715.186491] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-117e25aa-5ac3-42b0-85ce-db6ca72109ff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.196940] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffe67c4c-456d-4fe7-89cc-6bbb719cf969 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.210571] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef3b454f-a0df-49aa-a647-9d1580aa1482 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.248617] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-695cfa14-fac8-48d4-8921-a725b39fc203 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.256268] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-319e0303-0c1b-4eac-9deb-dee29b4e0acb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.288791] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 715.361138] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/41c4bdce-1831-4291-a1fd-7e3acdc6075c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 715.466345] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 715.466626] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/41c4bdce-1831-4291-a1fd-7e3acdc6075c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
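The rw_handles records above show the image bytes being streamed straight to the datastore's HTTPS "folder" endpoint, authorized by the generic service ticket acquired just before. A hedged sketch of that upload; the requests-based call and the ticket cookie name are assumptions here, the real code path is oslo_vmware.rw_handles:

```python
import requests  # illustrative; the real code path is oslo_vmware.rw_handles

def upload_vmdk(url, image_chunks, ticket):
    # Hypothetical: stream image data to the datastore 'folder' URL logged
    # above. 'ticket' stands in for the result of
    # SessionManager.AcquireGenericServiceTicket; the cookie name is an
    # assumption, not a documented contract.
    headers = {'Content-Type': 'application/octet-stream',
               'Cookie': 'vmware_cgi_ticket=' + ticket}
    resp = requests.put(url, data=image_chunks, headers=headers, timeout=300)
    resp.raise_for_status()
```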
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 715.586469] env[67977]: DEBUG nova.network.neutron [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Successfully created port: 1653050f-2f2d-488c-b821-669397edcf6e {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 716.110594] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquiring lock "faf24c4e-135e-47df-85a6-05024bc9b64b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 716.111523] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Lock "faf24c4e-135e-47df-85a6-05024bc9b64b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 716.133392] env[67977]: DEBUG nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 716.195296] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 716.195830] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 716.197121] env[67977]: INFO nova.compute.claims [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 716.321761] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquiring lock "02dea9f7-00be-4305-909c-ab9245b60e1d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 716.321995] env[67977]: DEBUG oslo_concurrency.lockutils [None 
req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Lock "02dea9f7-00be-4305-909c-ab9245b60e1d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 716.334675] env[67977]: DEBUG nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 716.393016] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfc48232-708c-4a7f-866d-432427fd1d2b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.409087] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d68cf963-a241-45be-b147-9a37e730e837 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.416868] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 716.442639] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e9bf08e-406c-4e57-ac4c-fc0128f5ebdf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.449935] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c1d97c6-d66d-4a3f-848f-d1759864bb1b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.467413] env[67977]: DEBUG nova.compute.provider_tree [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 716.469392] env[67977]: DEBUG nova.network.neutron [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Successfully created port: 6b969739-8f65-4d46-90c2-1e4f0b80f466 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 716.476593] env[67977]: DEBUG nova.scheduler.client.report [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 716.496217] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 716.496629] env[67977]: DEBUG nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 716.499186] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.082s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 716.500412] env[67977]: INFO nova.compute.claims [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 716.543330] env[67977]: DEBUG nova.compute.utils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 716.545232] env[67977]: DEBUG nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 716.545507] env[67977]: DEBUG nova.network.neutron [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 716.560327] env[67977]: DEBUG nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Start building block device mappings for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 716.664798] env[67977]: DEBUG nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 716.674554] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-938df2e5-c458-4305-a80a-6735a1d10876 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.684963] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72b55f68-ebd1-459c-8c4c-f99c836e23a6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.719877] env[67977]: DEBUG nova.virt.hardware [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 716.720164] env[67977]: DEBUG nova.virt.hardware [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 716.720323] env[67977]: DEBUG nova.virt.hardware [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 716.720500] env[67977]: DEBUG nova.virt.hardware [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 716.720642] env[67977]: DEBUG nova.virt.hardware [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 716.720784] env[67977]: DEBUG nova.virt.hardware [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 
tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 716.720988] env[67977]: DEBUG nova.virt.hardware [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 716.721158] env[67977]: DEBUG nova.virt.hardware [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 716.721318] env[67977]: DEBUG nova.virt.hardware [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 716.721480] env[67977]: DEBUG nova.virt.hardware [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 716.721686] env[67977]: DEBUG nova.virt.hardware [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 716.722714] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ce53847-2a73-4c28-bfe4-90d8fed2d31e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.726303] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2106a359-d844-4372-8dd1-c548373dee1a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.734735] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d3b30fe-6961-485f-ae89-dd8ff37a6150 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.757482] env[67977]: DEBUG nova.compute.provider_tree [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 716.760031] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c89ea0cb-4d6e-4093-aa1c-dfd0ba64b020 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.774575] 
env[67977]: DEBUG nova.scheduler.client.report [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 716.795712] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.297s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 716.796316] env[67977]: DEBUG nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 716.849740] env[67977]: DEBUG nova.compute.utils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 716.852204] env[67977]: DEBUG nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 716.852438] env[67977]: DEBUG nova.network.neutron [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 716.865463] env[67977]: DEBUG nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 716.971830] env[67977]: DEBUG nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Start spawning the instance on the hypervisor. 
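The recurring acquired / "released" ... held N.NNNs records around "compute_resources" are oslo.concurrency lock bookkeeping: the resource tracker serializes every claim behind one named lock. A minimal sketch of the same pattern (the claim body here is a placeholder, not the resource tracker's actual logic):

```python
from oslo_concurrency import lockutils

claims = []

# A named lock serializes claim bookkeeping; this decorator is what emits
# the acquired / "released" ... held N.NNNs debug lines when oslo logging
# is configured.
@lockutils.synchronized('compute_resources')
def instance_claim(instance_uuid):
    # Placeholder body: Nova's real claim tests and records resources here.
    claims.append(instance_uuid)

instance_claim('04e59d76-a2d5-482c-90a0-fcb407c0bd4e')
```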
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 717.010707] env[67977]: DEBUG nova.virt.hardware [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 717.010958] env[67977]: DEBUG nova.virt.hardware [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 717.011171] env[67977]: DEBUG nova.virt.hardware [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 717.011382] env[67977]: DEBUG nova.virt.hardware [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 717.011529] env[67977]: DEBUG nova.virt.hardware [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 717.011671] env[67977]: DEBUG nova.virt.hardware [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 717.012135] env[67977]: DEBUG nova.virt.hardware [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 717.012326] env[67977]: DEBUG nova.virt.hardware [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 717.012970] env[67977]: DEBUG nova.virt.hardware [None 
req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 717.013177] env[67977]: DEBUG nova.virt.hardware [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 717.013361] env[67977]: DEBUG nova.virt.hardware [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 717.014582] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47f68f74-9e7c-434c-92e0-3695d9e73a8c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.026627] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0edb338c-b3d2-4f1a-8032-fb2b13cf201d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.080749] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquiring lock "fece5f33-93ed-4202-8cd0-637924929ee4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 717.080981] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Lock "fece5f33-93ed-4202-8cd0-637924929ee4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 717.096044] env[67977]: DEBUG nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Starting instance... 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 717.177533] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 717.177826] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 717.179402] env[67977]: INFO nova.compute.claims [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 717.263427] env[67977]: DEBUG nova.network.neutron [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Successfully created port: ca351fe7-626f-4a6b-a218-ff68c1ea7f99 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 717.320929] env[67977]: DEBUG nova.policy [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7c86835db7a64e8aa5c25344920d63ac', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1cb0c68cec744095a75c950deb5bf143', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 717.426029] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca2fd451-7341-472a-b0c7-5c63ef36a4f5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.435115] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a2ad7c8-9d27-4736-afc1-1e7e463ba8ff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.469428] env[67977]: DEBUG nova.policy [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '86f9d0e1413f4092ac3f9296f432bb2a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '31f0a47096fd4368b4fe144bc2102e7a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 
'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 717.471559] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b39e592-92b0-4a97-a4cd-07233580b8ac {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.479456] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06122f2c-1092-422f-9a55-12361d3a8ca5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.495638] env[67977]: DEBUG nova.compute.provider_tree [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 717.512586] env[67977]: DEBUG nova.scheduler.client.report [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 717.535660] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.358s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 717.536180] env[67977]: DEBUG nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 717.584800] env[67977]: DEBUG nova.compute.utils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 717.586099] env[67977]: DEBUG nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Not allocating networking since 'none' was specified. 
{{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 717.600378] env[67977]: DEBUG nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 717.702275] env[67977]: DEBUG nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 717.741933] env[67977]: DEBUG nova.virt.hardware [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 717.742102] env[67977]: DEBUG nova.virt.hardware [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 717.742332] env[67977]: DEBUG nova.virt.hardware [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 717.742413] env[67977]: DEBUG nova.virt.hardware [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 717.742553] env[67977]: DEBUG nova.virt.hardware [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 717.742737] env[67977]: DEBUG nova.virt.hardware [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 
717.742965] env[67977]: DEBUG nova.virt.hardware [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 717.745566] env[67977]: DEBUG nova.virt.hardware [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 717.745566] env[67977]: DEBUG nova.virt.hardware [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 717.745738] env[67977]: DEBUG nova.virt.hardware [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 717.745966] env[67977]: DEBUG nova.virt.hardware [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 717.746887] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b514c8be-7ddd-457e-b7a5-0bf183ca524b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.757297] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0f10713-c319-47dd-9b3d-54a48e73b02b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.777716] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Instance VIF info [] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 717.788524] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Creating folder: Project (e26e96a127d24bc8b61811a8411fb497). Parent ref: group-v693022. 
{{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 717.789136] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-57e45a05-a4b5-4c9b-8c75-99edd5d39c48 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.799961] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Created folder: Project (e26e96a127d24bc8b61811a8411fb497) in parent group-v693022. [ 717.800664] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Creating folder: Instances. Parent ref: group-v693029. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 717.801032] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cf01dbfc-b69c-4295-8bae-93a1d5673e76 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.813223] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Created folder: Instances in parent group-v693029. [ 717.813508] env[67977]: DEBUG oslo.service.loopingcall [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 717.813766] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 717.813931] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-771fdc7c-428d-47c2-8d6e-878dcb315846 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.838983] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 717.838983] env[67977]: value = "task-3468091" [ 717.838983] env[67977]: _type = "Task" [ 717.838983] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 717.849574] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468091, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 718.353298] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468091, 'name': CreateVM_Task} progress is 99%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 718.850657] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468091, 'name': CreateVM_Task} progress is 99%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 719.354100] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468091, 'name': CreateVM_Task, 'duration_secs': 1.376821} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 719.354100] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 719.354342] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 719.356635] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 719.356635] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 719.356635] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a958533b-be64-4197-84e6-ccdcf5e6edd8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.363624] env[67977]: DEBUG oslo_vmware.api [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Waiting for the task: (returnval){ [ 719.363624] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]522dcf36-5625-836f-6181-338344e2ffe6" [ 719.363624] env[67977]: _type = "Task" [ 719.363624] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 719.371995] env[67977]: DEBUG oslo_vmware.api [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]522dcf36-5625-836f-6181-338344e2ffe6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 719.879882] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 719.879882] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 719.879882] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 720.402874] env[67977]: DEBUG nova.network.neutron [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Successfully created port: cf86095f-6af5-4123-b1f6-c21363bcc825 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 720.694283] env[67977]: DEBUG nova.network.neutron [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Successfully updated port: 6b969739-8f65-4d46-90c2-1e4f0b80f466 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 720.713063] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Acquiring lock "refresh_cache-04e59d76-a2d5-482c-90a0-fcb407c0bd4e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 720.713211] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Acquired lock "refresh_cache-04e59d76-a2d5-482c-90a0-fcb407c0bd4e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 720.713380] env[67977]: DEBUG nova.network.neutron [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 720.963280] env[67977]: DEBUG nova.network.neutron [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 
04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 720.978483] env[67977]: DEBUG nova.network.neutron [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Successfully created port: c80b1c1b-4792-4b89-ad1d-bc34ebb6428d {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 721.257278] env[67977]: DEBUG nova.compute.manager [req-d4a7b6fd-3858-4312-bc02-180576d04f44 req-2862533c-6a3e-4071-bb59-6a12c7c5f08f service nova] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Received event network-vif-plugged-6b969739-8f65-4d46-90c2-1e4f0b80f466 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 721.257278] env[67977]: DEBUG oslo_concurrency.lockutils [req-d4a7b6fd-3858-4312-bc02-180576d04f44 req-2862533c-6a3e-4071-bb59-6a12c7c5f08f service nova] Acquiring lock "04e59d76-a2d5-482c-90a0-fcb407c0bd4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 721.257278] env[67977]: DEBUG oslo_concurrency.lockutils [req-d4a7b6fd-3858-4312-bc02-180576d04f44 req-2862533c-6a3e-4071-bb59-6a12c7c5f08f service nova] Lock "04e59d76-a2d5-482c-90a0-fcb407c0bd4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 721.257278] env[67977]: DEBUG oslo_concurrency.lockutils [req-d4a7b6fd-3858-4312-bc02-180576d04f44 req-2862533c-6a3e-4071-bb59-6a12c7c5f08f service nova] Lock "04e59d76-a2d5-482c-90a0-fcb407c0bd4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 721.258510] env[67977]: DEBUG nova.compute.manager [req-d4a7b6fd-3858-4312-bc02-180576d04f44 req-2862533c-6a3e-4071-bb59-6a12c7c5f08f service nova] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] No waiting events found dispatching network-vif-plugged-6b969739-8f65-4d46-90c2-1e4f0b80f466 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 721.258845] env[67977]: WARNING nova.compute.manager [req-d4a7b6fd-3858-4312-bc02-180576d04f44 req-2862533c-6a3e-4071-bb59-6a12c7c5f08f service nova] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Received unexpected event network-vif-plugged-6b969739-8f65-4d46-90c2-1e4f0b80f466 for instance with vm_state building and task_state spawning. 
[ 721.569758] env[67977]: DEBUG nova.network.neutron [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Successfully updated port: ca351fe7-626f-4a6b-a218-ff68c1ea7f99 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 721.590152] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "refresh_cache-b623a2f1-404e-4f48-aeb2-ebb372260a86" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 721.590329] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquired lock "refresh_cache-b623a2f1-404e-4f48-aeb2-ebb372260a86" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 721.590487] env[67977]: DEBUG nova.network.neutron [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 721.767606] env[67977]: DEBUG nova.network.neutron [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.180451] env[67977]: DEBUG nova.network.neutron [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Updating instance_info_cache with network_info: [{"id": "6b969739-8f65-4d46-90c2-1e4f0b80f466", "address": "fa:16:3e:33:32:f0", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.249", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b969739-8f", "ovs_interfaceid": "6b969739-8f65-4d46-90c2-1e4f0b80f466", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.214012] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Releasing lock "refresh_cache-04e59d76-a2d5-482c-90a0-fcb407c0bd4e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 722.214335] env[67977]: DEBUG nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Instance network_info: |[{"id": "6b969739-8f65-4d46-90c2-1e4f0b80f466", "address": "fa:16:3e:33:32:f0", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.249", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b969739-8f", "ovs_interfaceid": "6b969739-8f65-4d46-90c2-1e4f0b80f466", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 722.214816] 
env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:33:32:f0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dbd2870d-a51d-472a-8034-1b3e132b5cb6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6b969739-8f65-4d46-90c2-1e4f0b80f466', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 722.226953] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Creating folder: Project (4e4ed49c2670412eb72ce0e7c9cf9ec8). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 722.227620] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0708ac65-1561-48cb-af3a-fc8646f6d76e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.239334] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Created folder: Project (4e4ed49c2670412eb72ce0e7c9cf9ec8) in parent group-v693022. [ 722.239653] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Creating folder: Instances. Parent ref: group-v693032. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 722.239784] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-adb24137-ddd9-4000-952b-9dc17d05fcaf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.254095] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Created folder: Instances in parent group-v693032. [ 722.254338] env[67977]: DEBUG oslo.service.loopingcall [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 722.254924] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 722.254924] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9d581dca-19d4-48aa-9a64-83070c650df2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.277319] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 722.277319] env[67977]: value = "task-3468095" [ 722.277319] env[67977]: _type = "Task" [ 722.277319] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 722.290130] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468095, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 722.406889] env[67977]: DEBUG nova.network.neutron [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Updating instance_info_cache with network_info: [{"id": "ca351fe7-626f-4a6b-a218-ff68c1ea7f99", "address": "fa:16:3e:f8:40:ca", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.136", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapca351fe7-62", "ovs_interfaceid": "ca351fe7-626f-4a6b-a218-ff68c1ea7f99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.422854] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Releasing lock "refresh_cache-b623a2f1-404e-4f48-aeb2-ebb372260a86" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 722.423226] env[67977]: DEBUG nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Instance network_info: |[{"id": "ca351fe7-626f-4a6b-a218-ff68c1ea7f99", "address": "fa:16:3e:f8:40:ca", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], 
"gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.136", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapca351fe7-62", "ovs_interfaceid": "ca351fe7-626f-4a6b-a218-ff68c1ea7f99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 722.423646] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f8:40:ca', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dbd2870d-a51d-472a-8034-1b3e132b5cb6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ca351fe7-626f-4a6b-a218-ff68c1ea7f99', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 722.434975] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Creating folder: Project (42c61dc228d94e8bb14985347d4f811a). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 722.434975] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cf82fd9a-91c4-4331-b6ba-8e85aea5a851 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.444786] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Created folder: Project (42c61dc228d94e8bb14985347d4f811a) in parent group-v693022. [ 722.445208] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Creating folder: Instances. Parent ref: group-v693035. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 722.445555] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9aadb668-b95d-437c-bf43-5893d9b89460 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.458854] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Created folder: Instances in parent group-v693035. 
[ 722.458854] env[67977]: DEBUG oslo.service.loopingcall [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 722.459098] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 722.459293] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7d44a612-1791-4ea2-999f-2b66983bf0ef {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.486210] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 722.486210] env[67977]: value = "task-3468099" [ 722.486210] env[67977]: _type = "Task" [ 722.486210] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 722.500180] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468099, 'name': CreateVM_Task} progress is 5%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 722.784239] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 722.789509] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 722.790103] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 722.790103] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 722.791222] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468095, 'name': CreateVM_Task, 'duration_secs': 0.337921} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 722.791575] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 722.816246] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 722.816751] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 722.817062] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 722.817551] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 722.817713] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 722.817844] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 722.817966] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 722.818596] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 722.821166] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 722.821166] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 722.821166] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 722.821542] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 722.822346] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 722.822521] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 722.822579] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 722.823313] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 722.828999] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 722.829557] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 722.829557] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 722.830392] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7cb0749b-33d6-4547-96c5-41f794d49415 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.838191] env[67977]: DEBUG oslo_vmware.api [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Waiting for the task: (returnval){ [ 722.838191] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52926473-943b-9fce-0636-f65003f0a636" [ 722.838191] env[67977]: _type = "Task" [ 722.838191] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 722.845017] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 722.845407] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 722.846422] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 722.846422] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 722.848971] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41a529b0-2e8b-4069-aea2-7bee50db1a41 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.858697] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 722.858697] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 722.858697] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 722.868044] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e25be3dd-a147-4cae-87e7-53035fe8b68a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.887595] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ebf2e9a-3e88-4aa9-aaac-68a194b3d2f4 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.897467] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-297681d4-5804-4cb4-8840-3873ad4f97a3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.936945] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180914MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 722.937462] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 722.937828] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 723.006999] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468099, 'name': CreateVM_Task} progress is 99%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 723.105427] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 7900e978-def6-4636-a2cb-94c322a23d15 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 723.106341] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f6e698af-6d7e-40d5-988b-450f300b67a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 723.106341] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 04e59d76-a2d5-482c-90a0-fcb407c0bd4e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 723.106341] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b623a2f1-404e-4f48-aeb2-ebb372260a86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 723.106341] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance faf24c4e-135e-47df-85a6-05024bc9b64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 723.106532] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 02dea9f7-00be-4305-909c-ab9245b60e1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 723.106532] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance fece5f33-93ed-4202-8cd0-637924929ee4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 723.106532] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 723.106631] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 723.274427] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d25505ea-e289-4089-9897-3dd379d476a0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.287579] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b951df0-660c-46a4-a94b-a2c7e6525788 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.333033] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b4e5661-7ca7-4b9c-a538-7689b088bdaf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.343923] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae5feae4-c4ab-4637-8756-539dd642b1f1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.363484] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 723.383060] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 723.412680] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 723.412878] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.475s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 723.497821] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468099, 'name': CreateVM_Task, 'duration_secs': 0.564617} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 723.497995] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 723.498756] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 723.498918] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 723.499269] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 723.499522] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-21bed376-5b5d-4cb5-a0e7-465195e947c8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.504734] env[67977]: DEBUG oslo_vmware.api [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Waiting for the task: (returnval){ [ 723.504734] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523fcea9-ffb4-cd43-7864-495f7084faad" [ 723.504734] env[67977]: _type = "Task" [ 723.504734] env[67977]: } to complete. 
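{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}

The repeated "Inventory has not changed for provider ... based on inventory data: {...}" entries indicate a compare-and-skip step: the report client rebuilds the inventory payload from the hypervisor stats each period and only pushes it to placement when it differs from the cached ProviderTree copy. A sketch of that check, with the figures copied from the log and the helper name invented:

```python
def build_inventory(total_vcpus, total_ram_mb, total_disk_gb):
    """Rebuild the placement inventory payload from hypervisor stats,
    shaped like the dict printed in the log above; max_unit is the
    largest single allocation the host accepts."""
    return {
        "VCPU": {"total": total_vcpus, "reserved": 0, "min_unit": 1,
                 "max_unit": 16, "step_size": 1, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": total_ram_mb, "reserved": 512, "min_unit": 1,
                      "max_unit": 65530, "step_size": 1, "allocation_ratio": 1.0},
        "DISK_GB": {"total": total_disk_gb, "reserved": 0, "min_unit": 1,
                    "max_unit": 94, "step_size": 1, "allocation_ratio": 1.0},
    }

cached = build_inventory(48, 196590, 400)   # ProviderTree's current view
fresh = build_inventory(48, 196590, 400)    # recomputed this period
if fresh == cached:
    # "Inventory has not changed for provider cc0e1e8e-..." -> no API call
    print("Inventory has not changed for provider")
else:
    cached = fresh  # would PUT the new inventory to placement
```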
[ 723.513131] env[67977]: DEBUG oslo_vmware.api [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523fcea9-ffb4-cd43-7864-495f7084faad, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 723.582086] env[67977]: DEBUG nova.compute.manager [req-06c18cc9-af01-48d4-b186-0145a3b69dde req-17a9ad16-459d-4ccf-88ab-3a757994edfc service nova] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Received event network-vif-plugged-1653050f-2f2d-488c-b821-669397edcf6e {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 723.582086] env[67977]: DEBUG oslo_concurrency.lockutils [req-06c18cc9-af01-48d4-b186-0145a3b69dde req-17a9ad16-459d-4ccf-88ab-3a757994edfc service nova] Acquiring lock "f6e698af-6d7e-40d5-988b-450f300b67a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 723.582086] env[67977]: DEBUG oslo_concurrency.lockutils [req-06c18cc9-af01-48d4-b186-0145a3b69dde req-17a9ad16-459d-4ccf-88ab-3a757994edfc service nova] Lock "f6e698af-6d7e-40d5-988b-450f300b67a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 723.582086] env[67977]: DEBUG oslo_concurrency.lockutils [req-06c18cc9-af01-48d4-b186-0145a3b69dde req-17a9ad16-459d-4ccf-88ab-3a757994edfc service nova] Lock "f6e698af-6d7e-40d5-988b-450f300b67a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 723.582309] env[67977]: DEBUG nova.compute.manager [req-06c18cc9-af01-48d4-b186-0145a3b69dde req-17a9ad16-459d-4ccf-88ab-3a757994edfc service nova] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] No waiting events found dispatching network-vif-plugged-1653050f-2f2d-488c-b821-669397edcf6e {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 723.582528] env[67977]: WARNING nova.compute.manager [req-06c18cc9-af01-48d4-b186-0145a3b69dde req-17a9ad16-459d-4ccf-88ab-3a757994edfc service nova] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Received unexpected event network-vif-plugged-1653050f-2f2d-488c-b821-669397edcf6e for instance with vm_state building and task_state spawning.
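The "-events" lock / "No waiting events found dispatching ..." / WARNING triple above is the external-event rendezvous: a spawning thread can register a waiter for network-vif-plugged, and the handler for Neutron's callback pops that waiter under a per-instance lock; an event that arrives with no waiter registered is logged as unexpected, which is harmless while the instance is still building. A condensed, illustrative sketch of the pattern (only the class and method names mirror the log):

```python
import threading

class InstanceEvents:
    """Rendezvous between a thread spawning an instance and the handler
    that receives events from the network service (simplified)."""

    def __init__(self):
        self._lock = threading.Lock()   # the "<uuid>-events" lock in the log
        self._waiters = {}              # (instance_uuid, event_name) -> Event

    def prepare_for_event(self, instance_uuid, event_name):
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = ev
        return ev

    def pop_instance_event(self, instance_uuid, event_name):
        with self._lock:                # "acquired ... :: waited 0.000s"
            return self._waiters.pop((instance_uuid, event_name), None)

events = InstanceEvents()

def external_instance_event(instance_uuid, event_name):
    waiter = events.pop_instance_event(instance_uuid, event_name)
    if waiter is None:
        # "No waiting events found dispatching ..." followed by the
        # WARNING about an unexpected event while building/spawning.
        print(f"WARNING: unexpected event {event_name} for {instance_uuid}")
    else:
        waiter.set()                    # unblocks the waiting spawn thread

external_instance_event("f6e698af-6d7e-40d5-988b-450f300b67a1",
                        "network-vif-plugged-1653050f")
```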
[ 724.022169] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 724.022487] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 724.022760] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 724.269058] env[67977]: DEBUG nova.network.neutron [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Successfully updated port: 1653050f-2f2d-488c-b821-669397edcf6e {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 724.284267] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquiring lock "refresh_cache-f6e698af-6d7e-40d5-988b-450f300b67a1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 724.285118] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquired lock "refresh_cache-f6e698af-6d7e-40d5-988b-450f300b67a1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 724.287315] env[67977]: DEBUG nova.network.neutron [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 724.806200] env[67977]: DEBUG nova.network.neutron [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 724.999943] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquiring lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 724.999943] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 725.014360] env[67977]: DEBUG nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 725.102343] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 725.102714] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 725.104463] env[67977]: INFO nova.compute.claims [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 725.400819] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be853ed7-f039-44ce-8090-f25597459617 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 725.414244] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00bd0261-ffcd-4f30-b1ca-e4919823466d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 725.455469] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d21d402-faab-4f88-8042-924658fcfda2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 725.463994] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-919051ee-4d4b-4da3-8d51-a818ff550f57 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 725.485634] env[67977]: DEBUG nova.compute.provider_tree [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 725.493883] env[67977]: DEBUG nova.compute.manager [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Received event network-vif-plugged-ca351fe7-626f-4a6b-a218-ff68c1ea7f99 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 725.494016] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] Acquiring lock "b623a2f1-404e-4f48-aeb2-ebb372260a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 725.494485] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] Lock "b623a2f1-404e-4f48-aeb2-ebb372260a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 725.494709] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] Lock "b623a2f1-404e-4f48-aeb2-ebb372260a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 725.494882] env[67977]: DEBUG nova.compute.manager [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] No waiting events found dispatching network-vif-plugged-ca351fe7-626f-4a6b-a218-ff68c1ea7f99 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 725.495045] env[67977]: WARNING nova.compute.manager [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Received unexpected event network-vif-plugged-ca351fe7-626f-4a6b-a218-ff68c1ea7f99 for instance with vm_state building and task_state spawning. [ 725.495223] env[67977]: DEBUG nova.compute.manager [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Received event network-changed-6b969739-8f65-4d46-90c2-1e4f0b80f466 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 725.495377] env[67977]: DEBUG nova.compute.manager [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Refreshing instance network info cache due to event network-changed-6b969739-8f65-4d46-90c2-1e4f0b80f466.
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 725.495566] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] Acquiring lock "refresh_cache-04e59d76-a2d5-482c-90a0-fcb407c0bd4e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 725.495702] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] Acquired lock "refresh_cache-04e59d76-a2d5-482c-90a0-fcb407c0bd4e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 725.495849] env[67977]: DEBUG nova.network.neutron [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Refreshing network info cache for port 6b969739-8f65-4d46-90c2-1e4f0b80f466 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 725.499872] env[67977]: DEBUG nova.scheduler.client.report [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 725.528259] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.424s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 725.528259] env[67977]: DEBUG nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 725.585158] env[67977]: DEBUG nova.compute.utils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 725.586938] env[67977]: DEBUG nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Allocating IP information in the background. 
{{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 725.587688] env[67977]: DEBUG nova.network.neutron [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 725.604515] env[67977]: DEBUG nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 725.735298] env[67977]: DEBUG nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 725.770891] env[67977]: DEBUG nova.virt.hardware [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 725.771155] env[67977]: DEBUG nova.virt.hardware [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 725.771346] env[67977]: DEBUG nova.virt.hardware [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 725.771594] env[67977]: DEBUG nova.virt.hardware [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 725.771677] env[67977]: DEBUG nova.virt.hardware [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 725.771857] env[67977]: DEBUG 
nova.virt.hardware [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 725.772035] env[67977]: DEBUG nova.virt.hardware [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 725.775438] env[67977]: DEBUG nova.virt.hardware [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 725.775513] env[67977]: DEBUG nova.virt.hardware [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 725.775713] env[67977]: DEBUG nova.virt.hardware [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 725.775874] env[67977]: DEBUG nova.virt.hardware [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 725.777158] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bca48865-2060-4dbc-bb64-6764793a636c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 725.789377] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-290d06cb-009e-4412-8d98-a86a00b6e6e8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 725.954249] env[67977]: DEBUG nova.policy [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3d035152573845adab3df33367cc5900', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7a9224bb08444f6eba3fc60e5b6a0849', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 726.521785] env[67977]: DEBUG nova.network.neutron [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 
tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Updating instance_info_cache with network_info: [{"id": "1653050f-2f2d-488c-b821-669397edcf6e", "address": "fa:16:3e:ce:80:50", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.59", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1653050f-2f", "ovs_interfaceid": "1653050f-2f2d-488c-b821-669397edcf6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.540675] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Releasing lock "refresh_cache-f6e698af-6d7e-40d5-988b-450f300b67a1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 726.541060] env[67977]: DEBUG nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Instance network_info: |[{"id": "1653050f-2f2d-488c-b821-669397edcf6e", "address": "fa:16:3e:ce:80:50", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.59", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1653050f-2f", "ovs_interfaceid": "1653050f-2f2d-488c-b821-669397edcf6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 726.541710] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Instance VIF info [{'network_name': 'br-int', 
'mac_address': 'fa:16:3e:ce:80:50', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dbd2870d-a51d-472a-8034-1b3e132b5cb6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1653050f-2f2d-488c-b821-669397edcf6e', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 726.551596] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Creating folder: Project (8ac48fabcd394452844d1d65c0a22c4e). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 726.552593] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bb4e8165-dc3d-4381-9ea7-a2cb094963be {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.127361] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Created folder: Project (8ac48fabcd394452844d1d65c0a22c4e) in parent group-v693022. [ 727.127688] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Creating folder: Instances. Parent ref: group-v693039. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 727.128084] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-febcacbf-20a2-4248-a80d-422db0c056e6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.138396] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Created folder: Instances in parent group-v693039. [ 727.138634] env[67977]: DEBUG oslo.service.loopingcall [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 727.138816] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 727.139021] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dedd91b5-45f8-423a-92b0-6f1edd8c4686 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.160680] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 727.160680] env[67977]: value = "task-3468104" [ 727.160680] env[67977]: _type = "Task" [ 727.160680] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 727.170083] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468104, 'name': CreateVM_Task} progress is 0%. 
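{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}

Each "Instance VIF info [...]" entry above is the distilled form of one network_info record: bridge name, MAC address, an OpaqueNetwork reference keyed by the NSX logical-switch id from details, the iface id, and the vif_model. A sketch of that mapping, assuming exactly the field names printed in the log:

```python
def vif_info_from_network_info(network_info, vif_model="vmxnet3"):
    """Map neutron network_info entries (as printed above) to the VIF
    dicts fed into the VMware VM create spec."""
    vifs = []
    for vif in network_info:
        details = vif.get("details", {})
        vifs.append({
            "network_name": vif["network"]["bridge"],   # 'br-int'
            "mac_address": vif["address"],
            "network_ref": {
                "type": "OpaqueNetwork",
                "network-id": details["nsx-logical-switch-id"],
                "network-type": "nsx.LogicalSwitch",
                "use-external-id": True,
            },
            "iface_id": vif["id"],
            "vif_model": vif_model,
        })
    return vifs

# One entry shaped like the log's network_info (heavily trimmed):
sample = [{"id": "1653050f-2f2d-488c-b821-669397edcf6e",
           "address": "fa:16:3e:ce:80:50",
           "network": {"bridge": "br-int"},
           "details": {"nsx-logical-switch-id":
                       "dbd2870d-a51d-472a-8034-1b3e132b5cb6"}}]
print(vif_info_from_network_info(sample))
```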
[ 727.616751] env[67977]: DEBUG nova.network.neutron [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Updated VIF entry in instance network info cache for port 6b969739-8f65-4d46-90c2-1e4f0b80f466. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 727.617238] env[67977]: DEBUG nova.network.neutron [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Updating instance_info_cache with network_info: [{"id": "6b969739-8f65-4d46-90c2-1e4f0b80f466", "address": "fa:16:3e:33:32:f0", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.249", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b969739-8f", "ovs_interfaceid": "6b969739-8f65-4d46-90c2-1e4f0b80f466", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 727.630714] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] Releasing lock "refresh_cache-04e59d76-a2d5-482c-90a0-fcb407c0bd4e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 727.630981] env[67977]: DEBUG nova.compute.manager [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Received event network-changed-ca351fe7-626f-4a6b-a218-ff68c1ea7f99 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 727.631168] env[67977]: DEBUG nova.compute.manager [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Refreshing instance network info cache due to event network-changed-ca351fe7-626f-4a6b-a218-ff68c1ea7f99.
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 727.631386] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] Acquiring lock "refresh_cache-b623a2f1-404e-4f48-aeb2-ebb372260a86" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 727.631580] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] Acquired lock "refresh_cache-b623a2f1-404e-4f48-aeb2-ebb372260a86" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 727.631772] env[67977]: DEBUG nova.network.neutron [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Refreshing network info cache for port ca351fe7-626f-4a6b-a218-ff68c1ea7f99 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 727.673886] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468104, 'name': CreateVM_Task, 'duration_secs': 0.39481} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 727.675674] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 727.676515] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 727.676677] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 727.676994] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 727.677279] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ffa582f7-c22b-4df5-b287-e4319f4e3f43 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.684329] env[67977]: DEBUG oslo_vmware.api [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Waiting for the task: (returnval){ [ 727.684329] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ef5bb1-7611-e970-b7d8-6dc5e16a8ca8" [ 727.684329] env[67977]: _type = "Task" [ 727.684329] env[67977]: } to complete. 
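{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}

The network-changed handling above ("Refreshing network info cache for port ..." then "Updated VIF entry in instance network info cache for port ...") reduces to: under the instance's refresh_cache lock, re-fetch the port from Neutron and overwrite the matching entry in the cached network_info list. A simplified sketch with a plain dict standing in for the persisted cache:

```python
import threading

_cache_lock = threading.Lock()   # stands in for the "refresh_cache-<uuid>" lock
_nw_info_cache = {}              # instance_uuid -> list of VIF dicts

def refresh_port_entry(instance_uuid, port_id, fetch_port_from_neutron):
    """Handle a network-changed-<port> event for one instance."""
    with _cache_lock:                              # "Acquired lock refresh_cache-..."
        vifs = _nw_info_cache.setdefault(instance_uuid, [])
        fresh = fetch_port_from_neutron(port_id)   # round trip to Neutron
        for i, vif in enumerate(vifs):
            if vif["id"] == port_id:
                vifs[i] = fresh                    # "Updated VIF entry in instance
                break                              #  network info cache for port ..."
        else:
            vifs.append(fresh)                     # port was not cached yet

refresh_port_entry("04e59d76-a2d5-482c-90a0-fcb407c0bd4e",
                   "6b969739-8f65-4d46-90c2-1e4f0b80f466",
                   lambda port_id: {"id": port_id, "address": "fa:16:3e:33:32:f0"})
```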
[ 727.698901] env[67977]: DEBUG oslo_vmware.api [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ef5bb1-7611-e970-b7d8-6dc5e16a8ca8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 728.025367] env[67977]: DEBUG nova.compute.manager [req-4498fe27-9c99-47c6-8726-1eefda9bfc08 req-8fe7ce87-091e-499a-b7ac-599c6ac328fe service nova] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Received event network-changed-1653050f-2f2d-488c-b821-669397edcf6e {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 728.025367] env[67977]: DEBUG nova.compute.manager [req-4498fe27-9c99-47c6-8726-1eefda9bfc08 req-8fe7ce87-091e-499a-b7ac-599c6ac328fe service nova] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Refreshing instance network info cache due to event network-changed-1653050f-2f2d-488c-b821-669397edcf6e. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 728.025367] env[67977]: DEBUG oslo_concurrency.lockutils [req-4498fe27-9c99-47c6-8726-1eefda9bfc08 req-8fe7ce87-091e-499a-b7ac-599c6ac328fe service nova] Acquiring lock "refresh_cache-f6e698af-6d7e-40d5-988b-450f300b67a1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 728.025367] env[67977]: DEBUG oslo_concurrency.lockutils [req-4498fe27-9c99-47c6-8726-1eefda9bfc08 req-8fe7ce87-091e-499a-b7ac-599c6ac328fe service nova] Acquired lock "refresh_cache-f6e698af-6d7e-40d5-988b-450f300b67a1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 728.026375] env[67977]: DEBUG nova.network.neutron [req-4498fe27-9c99-47c6-8726-1eefda9bfc08 req-8fe7ce87-091e-499a-b7ac-599c6ac328fe service nova] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Refreshing network info cache for port 1653050f-2f2d-488c-b821-669397edcf6e {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 728.051297] env[67977]: DEBUG nova.network.neutron [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Successfully updated port: c80b1c1b-4792-4b89-ad1d-bc34ebb6428d {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 728.067071] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquiring lock "refresh_cache-faf24c4e-135e-47df-85a6-05024bc9b64b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 728.067245] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquired lock "refresh_cache-faf24c4e-135e-47df-85a6-05024bc9b64b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 728.067405] env[67977]: DEBUG nova.network.neutron [None
req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 728.119031] env[67977]: DEBUG nova.network.neutron [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Successfully updated port: cf86095f-6af5-4123-b1f6-c21363bcc825 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 728.132897] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquiring lock "refresh_cache-02dea9f7-00be-4305-909c-ab9245b60e1d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 728.132897] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquired lock "refresh_cache-02dea9f7-00be-4305-909c-ab9245b60e1d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 728.133073] env[67977]: DEBUG nova.network.neutron [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 728.198985] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 728.198985] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 728.198985] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 728.298607] env[67977]: DEBUG nova.network.neutron [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.491981] env[67977]: DEBUG nova.network.neutron [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.835460] env[67977]: DEBUG nova.network.neutron [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Successfully created port: c65e528f-bbdf-4a22-acfc-7b2389a6943d {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 729.375092] env[67977]: DEBUG nova.network.neutron [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Updated VIF entry in instance network info cache for port ca351fe7-626f-4a6b-a218-ff68c1ea7f99. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 729.375465] env[67977]: DEBUG nova.network.neutron [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Updating instance_info_cache with network_info: [{"id": "ca351fe7-626f-4a6b-a218-ff68c1ea7f99", "address": "fa:16:3e:f8:40:ca", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.136", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapca351fe7-62", "ovs_interfaceid": "ca351fe7-626f-4a6b-a218-ff68c1ea7f99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.387024] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0a93b01-6a5c-4215-be79-e196688ed73f req-a0ccd236-e5a6-4d57-bf47-da3ab3ad87d4 service nova] Releasing lock "refresh_cache-b623a2f1-404e-4f48-aeb2-ebb372260a86" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 729.819513] env[67977]: DEBUG nova.network.neutron [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Updating instance_info_cache with network_info: [{"id": "c80b1c1b-4792-4b89-ad1d-bc34ebb6428d", "address": "fa:16:3e:10:01:33", "network": {"id": "6a3d9985-b5bd-40c8-bb10-039ea9f0e86a", "bridge": "br-int", "label": 
"tempest-FloatingIPsAssociationTestJSON-1429590022-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1cb0c68cec744095a75c950deb5bf143", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d12aff80-9d1b-4a67-a470-9c0148b443e3", "external-id": "nsx-vlan-transportzone-784", "segmentation_id": 784, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc80b1c1b-47", "ovs_interfaceid": "c80b1c1b-4792-4b89-ad1d-bc34ebb6428d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.841273] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Releasing lock "refresh_cache-faf24c4e-135e-47df-85a6-05024bc9b64b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 729.841273] env[67977]: DEBUG nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Instance network_info: |[{"id": "c80b1c1b-4792-4b89-ad1d-bc34ebb6428d", "address": "fa:16:3e:10:01:33", "network": {"id": "6a3d9985-b5bd-40c8-bb10-039ea9f0e86a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1429590022-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1cb0c68cec744095a75c950deb5bf143", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d12aff80-9d1b-4a67-a470-9c0148b443e3", "external-id": "nsx-vlan-transportzone-784", "segmentation_id": 784, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc80b1c1b-47", "ovs_interfaceid": "c80b1c1b-4792-4b89-ad1d-bc34ebb6428d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 729.841701] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:10:01:33', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd12aff80-9d1b-4a67-a470-9c0148b443e3', 'network-type': 'nsx.LogicalSwitch', 
'use-external-id': True}, 'iface_id': 'c80b1c1b-4792-4b89-ad1d-bc34ebb6428d', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 729.849545] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Creating folder: Project (1cb0c68cec744095a75c950deb5bf143). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 729.852097] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b1d9a5af-b66c-4926-b3ce-9d02afe2d657 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.862584] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Created folder: Project (1cb0c68cec744095a75c950deb5bf143) in parent group-v693022. [ 729.862801] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Creating folder: Instances. Parent ref: group-v693042. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 729.863059] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dad666aa-69c2-40d8-9b27-d8de69536769 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.874166] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Created folder: Instances in parent group-v693042. [ 729.874428] env[67977]: DEBUG oslo.service.loopingcall [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 729.874805] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 729.875055] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-990445f1-6fbd-4ff1-8bc9-fc065bf26b76 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.896205] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 729.896205] env[67977]: value = "task-3468108" [ 729.896205] env[67977]: _type = "Task" [ 729.896205] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 729.904184] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468108, 'name': CreateVM_Task} progress is 0%. 
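[editor's note] The "Waiting for the task" / "progress is 0%" records here are oslo.vmware's task-polling loop. A sketch under stated assumptions (a live VMwareAPISession; folder_ref, config_spec and res_pool_ref are managed-object refs built elsewhere; endpoint and credentials are placeholders):

    from oslo_vmware import api

    def create_vm(session, folder_ref, config_spec, res_pool_ref):
        # invoke_api() issues the SOAP call (Folder.CreateVM_Task here) and
        # returns a task moref; wait_for_task() polls it at the session's
        # task_poll_interval until SUCCESS, raising if the task errors.
        task = session.invoke_api(session.vim, "CreateVM_Task", folder_ref,
                                  config=config_spec, pool=res_pool_ref)
        return session.wait_for_task(task)

    # Constructing the session logs into vCenter, as at the top of this log.
    session = api.VMwareAPISession("vc.example.test", "user", "secret",
                                   api_retry_count=10, task_poll_interval=0.5)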
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 730.079031] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 730.079031] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 730.099063] env[67977]: DEBUG nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 730.161290] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 730.161544] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 730.163452] env[67977]: INFO nova.compute.claims [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 730.339658] env[67977]: DEBUG nova.network.neutron [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Updating instance_info_cache with network_info: [{"id": "cf86095f-6af5-4123-b1f6-c21363bcc825", "address": "fa:16:3e:50:2a:9f", "network": {"id": "77457578-002d-4b2c-bbca-deb3fd25ce90", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1822261757-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "31f0a47096fd4368b4fe144bc2102e7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", 
"details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7b2049d7-f99e-425a-afdb-2c95ca88e483", "external-id": "nsx-vlan-transportzone-803", "segmentation_id": 803, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf86095f-6a", "ovs_interfaceid": "cf86095f-6af5-4123-b1f6-c21363bcc825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.359679] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Releasing lock "refresh_cache-02dea9f7-00be-4305-909c-ab9245b60e1d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 730.360235] env[67977]: DEBUG nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Instance network_info: |[{"id": "cf86095f-6af5-4123-b1f6-c21363bcc825", "address": "fa:16:3e:50:2a:9f", "network": {"id": "77457578-002d-4b2c-bbca-deb3fd25ce90", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1822261757-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "31f0a47096fd4368b4fe144bc2102e7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7b2049d7-f99e-425a-afdb-2c95ca88e483", "external-id": "nsx-vlan-transportzone-803", "segmentation_id": 803, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf86095f-6a", "ovs_interfaceid": "cf86095f-6af5-4123-b1f6-c21363bcc825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 730.360882] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:50:2a:9f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7b2049d7-f99e-425a-afdb-2c95ca88e483', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cf86095f-6af5-4123-b1f6-c21363bcc825', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 730.372144] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Creating folder: Project (31f0a47096fd4368b4fe144bc2102e7a). Parent ref: group-v693022. 
{{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 730.373445] env[67977]: DEBUG nova.network.neutron [req-4498fe27-9c99-47c6-8726-1eefda9bfc08 req-8fe7ce87-091e-499a-b7ac-599c6ac328fe service nova] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Updated VIF entry in instance network info cache for port 1653050f-2f2d-488c-b821-669397edcf6e. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 730.373541] env[67977]: DEBUG nova.network.neutron [req-4498fe27-9c99-47c6-8726-1eefda9bfc08 req-8fe7ce87-091e-499a-b7ac-599c6ac328fe service nova] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Updating instance_info_cache with network_info: [{"id": "1653050f-2f2d-488c-b821-669397edcf6e", "address": "fa:16:3e:ce:80:50", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.59", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1653050f-2f", "ovs_interfaceid": "1653050f-2f2d-488c-b821-669397edcf6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.377373] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ceee9d6a-6fcb-41f2-9ade-f7514fd6a0f2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.389384] env[67977]: DEBUG oslo_concurrency.lockutils [req-4498fe27-9c99-47c6-8726-1eefda9bfc08 req-8fe7ce87-091e-499a-b7ac-599c6ac328fe service nova] Releasing lock "refresh_cache-f6e698af-6d7e-40d5-988b-450f300b67a1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 730.393951] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Created folder: Project (31f0a47096fd4368b4fe144bc2102e7a) in parent group-v693022. [ 730.394122] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Creating folder: Instances. Parent ref: group-v693045. 
{{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 730.397818] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-edef3312-fc94-42a6-8be2-f109a86dd653 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.412470] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468108, 'name': CreateVM_Task, 'duration_secs': 0.453978} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 730.414452] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 730.414645] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Created folder: Instances in parent group-v693045. [ 730.416452] env[67977]: DEBUG oslo.service.loopingcall [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 730.416452] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-960493fb-ec63-446d-9aaa-20bdef0991a4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.419300] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 730.419390] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 730.420016] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 730.420016] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 730.420895] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2f18fbf8-4c6c-4fcb-8e62-4cd38e56bc0e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.422637] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with 
opID=oslo.vmware-4f9fd83d-2f55-44f4-88a2-46e35fc0a250 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.445563] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca862ee4-1d7c-4355-9a2e-d8b0fd728ac8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.452365] env[67977]: DEBUG oslo_vmware.api [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Waiting for the task: (returnval){ [ 730.452365] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5232e9e4-2c25-ec73-0b90-fa263bfd295f" [ 730.452365] env[67977]: _type = "Task" [ 730.452365] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 730.486033] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 730.486033] env[67977]: value = "task-3468111" [ 730.486033] env[67977]: _type = "Task" [ 730.486033] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 730.488669] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c815f06e-10ef-43a7-9122-4cf1b097cc81 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.498298] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 730.498722] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 730.499058] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 730.506087] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62970679-ac97-42fe-8ad2-2465d70ed95a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.510137] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468111, 'name': CreateVM_Task} progress is 25%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 730.525830] env[67977]: DEBUG nova.compute.provider_tree [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 730.537449] env[67977]: DEBUG nova.scheduler.client.report [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 730.558773] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.397s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 730.559323] env[67977]: DEBUG nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 730.618062] env[67977]: DEBUG nova.compute.utils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 730.619604] env[67977]: DEBUG nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 730.619790] env[67977]: DEBUG nova.network.neutron [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 730.650931] env[67977]: DEBUG nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Start building block device mappings for instance. 
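[editor's note] The inventory data in the report-client record above determines schedulable capacity per resource class as (total - reserved) * allocation_ratio. A worked check against the logged numbers:

    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
    }
    for rc, inv in inventory.items():
        usable = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, usable)  # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0

so the m1.nano claim above (1 VCPU, 128 MB, 1 GB root disk) is trivially satisfiable on this provider.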
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 730.808645] env[67977]: DEBUG nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 730.830579] env[67977]: DEBUG nova.policy [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9e43d60a79de4d2a95a45715763e913a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '724dbeaf6dbd4d5cb70f81dd6b3f3ba7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 730.844208] env[67977]: DEBUG nova.virt.hardware [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 730.844208] env[67977]: DEBUG nova.virt.hardware [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 730.845376] env[67977]: DEBUG nova.virt.hardware [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 730.846685] env[67977]: DEBUG nova.virt.hardware [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 730.846685] env[67977]: DEBUG nova.virt.hardware [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 730.846685] env[67977]: DEBUG nova.virt.hardware [None 
req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 730.846685] env[67977]: DEBUG nova.virt.hardware [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 730.846685] env[67977]: DEBUG nova.virt.hardware [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 730.846902] env[67977]: DEBUG nova.virt.hardware [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 730.846936] env[67977]: DEBUG nova.virt.hardware [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 730.847139] env[67977]: DEBUG nova.virt.hardware [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 730.848197] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-908a6367-f43e-47e2-b033-43df990aa292 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.859619] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0792977a-947f-4892-8531-ba75f20bf924 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.999300] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468111, 'name': CreateVM_Task, 'duration_secs': 0.344898} completed successfully. 
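[editor's note] The topology records above (preferred 0:0:0, maximum 65536:65536:65536, one possible topology for 1 vCPU) reduce to a small factorisation search. A simplified sketch of the enumeration, not Nova's exact _get_possible_cpu_topologies:

    import itertools

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # Enumerate sockets x cores x threads factorisations of the vCPU
        # count within the (defaulted) limits; 1 vCPU admits only (1, 1, 1).
        for s, c, t in itertools.product(range(1, min(vcpus, max_sockets) + 1),
                                         range(1, min(vcpus, max_cores) + 1),
                                         range(1, min(vcpus, max_threads) + 1)):
            if s * c * t == vcpus:
                yield (s, c, t)

    print(list(possible_topologies(1)))  # [(1, 1, 1)]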
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 730.999510] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 731.000179] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 731.000352] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 731.000705] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 731.000966] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-16e471de-8998-470f-aff2-47c37a9b7adc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.005657] env[67977]: DEBUG oslo_vmware.api [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Waiting for the task: (returnval){ [ 731.005657] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d85024-969f-b3b8-11e9-b48b36162d6b" [ 731.005657] env[67977]: _type = "Task" [ 731.005657] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 731.014342] env[67977]: DEBUG oslo_vmware.api [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d85024-969f-b3b8-11e9-b48b36162d6b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 731.518461] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 731.518750] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 731.519396] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 731.857339] env[67977]: DEBUG nova.network.neutron [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Successfully created port: e66cbb47-68bc-47c2-8326-3a3a0058a24f {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 732.678618] env[67977]: DEBUG nova.compute.manager [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Received event network-vif-plugged-cf86095f-6af5-4123-b1f6-c21363bcc825 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 732.678618] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Acquiring lock "02dea9f7-00be-4305-909c-ab9245b60e1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 732.679028] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Lock "02dea9f7-00be-4305-909c-ab9245b60e1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 732.679348] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Lock "02dea9f7-00be-4305-909c-ab9245b60e1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 732.679437] env[67977]: DEBUG nova.compute.manager [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] No waiting events found dispatching 
network-vif-plugged-cf86095f-6af5-4123-b1f6-c21363bcc825 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 732.680545] env[67977]: WARNING nova.compute.manager [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Received unexpected event network-vif-plugged-cf86095f-6af5-4123-b1f6-c21363bcc825 for instance with vm_state building and task_state spawning. [ 732.681287] env[67977]: DEBUG nova.compute.manager [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Received event network-vif-plugged-c80b1c1b-4792-4b89-ad1d-bc34ebb6428d {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 732.681287] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Acquiring lock "faf24c4e-135e-47df-85a6-05024bc9b64b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 732.682134] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Lock "faf24c4e-135e-47df-85a6-05024bc9b64b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 732.682209] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Lock "faf24c4e-135e-47df-85a6-05024bc9b64b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 732.682530] env[67977]: DEBUG nova.compute.manager [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] No waiting events found dispatching network-vif-plugged-c80b1c1b-4792-4b89-ad1d-bc34ebb6428d {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 732.682561] env[67977]: WARNING nova.compute.manager [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Received unexpected event network-vif-plugged-c80b1c1b-4792-4b89-ad1d-bc34ebb6428d for instance with vm_state building and task_state spawning. [ 732.682912] env[67977]: DEBUG nova.compute.manager [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Received event network-changed-c80b1c1b-4792-4b89-ad1d-bc34ebb6428d {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 732.682912] env[67977]: DEBUG nova.compute.manager [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Refreshing instance network info cache due to event network-changed-c80b1c1b-4792-4b89-ad1d-bc34ebb6428d. 
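[editor's note] The "No waiting events found dispatching" / "Received unexpected event" pair above reflects the compute manager's event registry: an incoming network-vif-plugged event only completes a waiter that the spawn path registered beforehand. A hypothetical miniature of that dispatch (threading.Event stands in for Nova's eventlet-based machinery):

    import threading

    _waiters = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare_for_event(instance_uuid, event_name):
        _waiters[(instance_uuid, event_name)] = threading.Event()

    def dispatch_event(instance_uuid, event_name):
        ev = _waiters.pop((instance_uuid, event_name), None)
        if ev is None:
            # No one was waiting: logged above as an "unexpected event" while
            # the instance is still in vm_state building / task_state spawning.
            print("WARNING: unexpected event %s for %s" % (event_name, instance_uuid))
        else:
            ev.set()  # wakes the spawn path blocked on this event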
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 732.683078] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Acquiring lock "refresh_cache-faf24c4e-135e-47df-85a6-05024bc9b64b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 732.683197] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Acquired lock "refresh_cache-faf24c4e-135e-47df-85a6-05024bc9b64b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 732.683353] env[67977]: DEBUG nova.network.neutron [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Refreshing network info cache for port c80b1c1b-4792-4b89-ad1d-bc34ebb6428d {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 733.214164] env[67977]: DEBUG nova.network.neutron [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Successfully updated port: c65e528f-bbdf-4a22-acfc-7b2389a6943d {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 733.232000] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquiring lock "refresh_cache-a2fd776e-9a01-4b67-bc23-1605d6e2b23e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 733.232136] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquired lock "refresh_cache-a2fd776e-9a01-4b67-bc23-1605d6e2b23e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 733.232267] env[67977]: DEBUG nova.network.neutron [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 733.384880] env[67977]: DEBUG nova.network.neutron [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.087375] env[67977]: DEBUG nova.network.neutron [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Updating instance_info_cache with network_info: [{"id": "c65e528f-bbdf-4a22-acfc-7b2389a6943d", "address": "fa:16:3e:52:2b:dc", "network": {"id": "905d297a-71f8-4582-94d5-1ffc5cf9edae", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-2117133888-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a9224bb08444f6eba3fc60e5b6a0849", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "171aeae0-6a27-44fc-bc3d-a2d5581fc702", "external-id": "nsx-vlan-transportzone-410", "segmentation_id": 410, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc65e528f-bb", "ovs_interfaceid": "c65e528f-bbdf-4a22-acfc-7b2389a6943d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.108257] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Releasing lock "refresh_cache-a2fd776e-9a01-4b67-bc23-1605d6e2b23e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 734.108600] env[67977]: DEBUG nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Instance network_info: |[{"id": "c65e528f-bbdf-4a22-acfc-7b2389a6943d", "address": "fa:16:3e:52:2b:dc", "network": {"id": "905d297a-71f8-4582-94d5-1ffc5cf9edae", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-2117133888-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a9224bb08444f6eba3fc60e5b6a0849", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "171aeae0-6a27-44fc-bc3d-a2d5581fc702", "external-id": "nsx-vlan-transportzone-410", "segmentation_id": 410, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc65e528f-bb", "ovs_interfaceid": "c65e528f-bbdf-4a22-acfc-7b2389a6943d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 734.109018] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:52:2b:dc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '171aeae0-6a27-44fc-bc3d-a2d5581fc702', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c65e528f-bbdf-4a22-acfc-7b2389a6943d', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 734.119469] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Creating folder: Project (7a9224bb08444f6eba3fc60e5b6a0849). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 734.119469] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-41da00da-7d30-42ce-bc05-c647d414685b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.133390] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 734.133547] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 734.137981] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Created folder: Project (7a9224bb08444f6eba3fc60e5b6a0849) in parent group-v693022. [ 734.137981] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Creating folder: Instances. Parent ref: group-v693048. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 734.137981] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1ed4f39e-64a6-4b2e-b198-b0ba1aec47fc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.145785] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Created folder: Instances in parent group-v693048. 
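[editor's note] The folder records above show the per-tenant layout the driver builds: "Project (<tenant-id>)" under the OpenStack root folder, then "Instances" beneath it. A sketch of those two Folder.CreateFolder calls through the session helper (session and parent refs assumed; the real code also tolerates DuplicateName when the folders already exist):

    def ensure_project_folders(session, root_folder_ref, project_id):
        project_ref = session.invoke_api(session.vim, "CreateFolder",
                                         root_folder_ref,
                                         name="Project (%s)" % project_id)
        return session.invoke_api(session.vim, "CreateFolder",
                                  project_ref, name="Instances")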
[ 734.146039] env[67977]: DEBUG oslo.service.loopingcall [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 734.146748] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 734.147093] env[67977]: DEBUG nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 734.157714] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-eeb1fd3c-7ac7-4265-9290-a7b9f15302e7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.178054] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 734.178054] env[67977]: value = "task-3468114" [ 734.178054] env[67977]: _type = "Task" [ 734.178054] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 734.186579] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468114, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 734.215296] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 734.215587] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 734.217165] env[67977]: INFO nova.compute.claims [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 734.444487] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9f88f3a-eaa5-4b8e-9eaf-a9f2ebb526c4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.452681] env[67977]: DEBUG nova.network.neutron [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Updated VIF entry in instance network info cache for port c80b1c1b-4792-4b89-ad1d-bc34ebb6428d. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 734.453557] env[67977]: DEBUG nova.network.neutron [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Updating instance_info_cache with network_info: [{"id": "c80b1c1b-4792-4b89-ad1d-bc34ebb6428d", "address": "fa:16:3e:10:01:33", "network": {"id": "6a3d9985-b5bd-40c8-bb10-039ea9f0e86a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1429590022-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1cb0c68cec744095a75c950deb5bf143", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d12aff80-9d1b-4a67-a470-9c0148b443e3", "external-id": "nsx-vlan-transportzone-784", "segmentation_id": 784, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc80b1c1b-47", "ovs_interfaceid": "c80b1c1b-4792-4b89-ad1d-bc34ebb6428d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.455281] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4125b391-cc42-4384-a900-cede697a0306 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.493138] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b700348-1fea-4010-9e49-6d02dab37e2d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.496528] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Releasing lock "refresh_cache-faf24c4e-135e-47df-85a6-05024bc9b64b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 734.496772] env[67977]: DEBUG nova.compute.manager [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Received event network-changed-cf86095f-6af5-4123-b1f6-c21363bcc825 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 734.496940] env[67977]: DEBUG nova.compute.manager [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Refreshing instance network info cache due to event network-changed-cf86095f-6af5-4123-b1f6-c21363bcc825. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 734.497164] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Acquiring lock "refresh_cache-02dea9f7-00be-4305-909c-ab9245b60e1d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 734.497342] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Acquired lock "refresh_cache-02dea9f7-00be-4305-909c-ab9245b60e1d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 734.497523] env[67977]: DEBUG nova.network.neutron [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Refreshing network info cache for port cf86095f-6af5-4123-b1f6-c21363bcc825 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 734.504855] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9395f775-61bb-4e8f-9420-06581b2d99af {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.519553] env[67977]: DEBUG nova.compute.provider_tree [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 734.537889] env[67977]: DEBUG nova.scheduler.client.report [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 734.567264] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.352s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 734.567947] env[67977]: DEBUG nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Start building networks asynchronously for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 734.642740] env[67977]: DEBUG nova.compute.utils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 734.644954] env[67977]: DEBUG nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 734.645099] env[67977]: DEBUG nova.network.neutron [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 734.661245] env[67977]: DEBUG nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 734.698725] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468114, 'name': CreateVM_Task, 'duration_secs': 0.461977} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 734.698725] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 734.699120] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 734.699917] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 734.699917] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 734.700103] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6f39a1c3-e7b8-4cd0-b32e-38be3e198de7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.706798] env[67977]: DEBUG oslo_vmware.api [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 
tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Waiting for the task: (returnval){ [ 734.706798] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]527084bb-09ff-225e-541d-5eb177fae2ea" [ 734.706798] env[67977]: _type = "Task" [ 734.706798] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 734.719576] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 734.719964] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 734.720810] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 734.795500] env[67977]: DEBUG nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 734.834020] env[67977]: DEBUG nova.virt.hardware [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 734.834623] env[67977]: DEBUG nova.virt.hardware [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 734.834623] env[67977]: DEBUG nova.virt.hardware [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 734.834623] env[67977]: DEBUG nova.virt.hardware [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 734.834809] env[67977]: DEBUG nova.virt.hardware [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 734.834809] env[67977]: DEBUG nova.virt.hardware [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 734.835016] env[67977]: DEBUG nova.virt.hardware [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 734.836166] env[67977]: DEBUG nova.virt.hardware [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 734.836445] env[67977]: DEBUG nova.virt.hardware [None 
req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 734.836603] env[67977]: DEBUG nova.virt.hardware [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 734.836889] env[67977]: DEBUG nova.virt.hardware [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 734.838345] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72a781ca-cc9b-4c45-af18-cc83b337d940 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.850193] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00606bf7-edc8-4c8d-8cea-398d2d41ed6f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.903148] env[67977]: DEBUG nova.policy [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9e43d60a79de4d2a95a45715763e913a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '724dbeaf6dbd4d5cb70f81dd6b3f3ba7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 735.130052] env[67977]: DEBUG nova.network.neutron [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Successfully updated port: e66cbb47-68bc-47c2-8326-3a3a0058a24f {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 735.144312] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "refresh_cache-3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 735.144312] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquired lock "refresh_cache-3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 735.144481] env[67977]: DEBUG nova.network.neutron [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 
tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 735.483114] env[67977]: DEBUG nova.network.neutron [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.050983] env[67977]: DEBUG nova.network.neutron [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Updated VIF entry in instance network info cache for port cf86095f-6af5-4123-b1f6-c21363bcc825. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 736.051355] env[67977]: DEBUG nova.network.neutron [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Updating instance_info_cache with network_info: [{"id": "cf86095f-6af5-4123-b1f6-c21363bcc825", "address": "fa:16:3e:50:2a:9f", "network": {"id": "77457578-002d-4b2c-bbca-deb3fd25ce90", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1822261757-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "31f0a47096fd4368b4fe144bc2102e7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7b2049d7-f99e-425a-afdb-2c95ca88e483", "external-id": "nsx-vlan-transportzone-803", "segmentation_id": 803, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf86095f-6a", "ovs_interfaceid": "cf86095f-6af5-4123-b1f6-c21363bcc825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.072569] env[67977]: DEBUG oslo_concurrency.lockutils [req-f35beeaf-4572-480e-bd04-e1fc804e8630 req-052f3685-7473-4a48-92e2-45b4a63f005f service nova] Releasing lock "refresh_cache-02dea9f7-00be-4305-909c-ab9245b60e1d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 736.317454] env[67977]: DEBUG nova.network.neutron [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Successfully created port: dec86def-dbdd-4517-a6ad-9796fbaf3b6b {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 736.352984] env[67977]: DEBUG nova.compute.manager [req-8e70548e-930e-4f7c-bfba-c5424c9f28df req-133ab40a-dd29-4350-b72c-abfb3dddd729 service nova] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Received event network-vif-plugged-c65e528f-bbdf-4a22-acfc-7b2389a6943d {{(pid=67977) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11101}} [ 736.352984] env[67977]: DEBUG oslo_concurrency.lockutils [req-8e70548e-930e-4f7c-bfba-c5424c9f28df req-133ab40a-dd29-4350-b72c-abfb3dddd729 service nova] Acquiring lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 736.353594] env[67977]: DEBUG oslo_concurrency.lockutils [req-8e70548e-930e-4f7c-bfba-c5424c9f28df req-133ab40a-dd29-4350-b72c-abfb3dddd729 service nova] Lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 736.353925] env[67977]: DEBUG oslo_concurrency.lockutils [req-8e70548e-930e-4f7c-bfba-c5424c9f28df req-133ab40a-dd29-4350-b72c-abfb3dddd729 service nova] Lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 736.354364] env[67977]: DEBUG nova.compute.manager [req-8e70548e-930e-4f7c-bfba-c5424c9f28df req-133ab40a-dd29-4350-b72c-abfb3dddd729 service nova] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] No waiting events found dispatching network-vif-plugged-c65e528f-bbdf-4a22-acfc-7b2389a6943d {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 736.354414] env[67977]: WARNING nova.compute.manager [req-8e70548e-930e-4f7c-bfba-c5424c9f28df req-133ab40a-dd29-4350-b72c-abfb3dddd729 service nova] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Received unexpected event network-vif-plugged-c65e528f-bbdf-4a22-acfc-7b2389a6943d for instance with vm_state building and task_state spawning. 
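The "Acquiring lock ... by ..." / "Lock ... acquired ... :: waited" / "Lock ... released ... :: held" triplets that recur throughout this trace are emitted by oslo.concurrency's lockutils module (the lockutils.py:404/409/423 and 312/315/333 trailers above). A minimal sketch of the two forms behind those lines, assuming only oslo.concurrency is installed; the lock names mirror the log, but the functions themselves are illustrative, not Nova source:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim():
        # Decorator form: concurrent callers using the same lock name
        # serialize here, and lockutils logs the "acquired ... waited" /
        # "released ... held" DEBUG lines with the measured durations.
        pass

    def refresh_cache(instance_uuid):
        # Context-manager form, matching the "Acquiring" / "Acquired" /
        # "Releasing lock" lines (lockutils.py:312/315/333) around the
        # per-instance network info cache refreshes above.
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass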
[ 736.360993] env[67977]: DEBUG nova.network.neutron [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Updating instance_info_cache with network_info: [{"id": "e66cbb47-68bc-47c2-8326-3a3a0058a24f", "address": "fa:16:3e:fd:c6:52", "network": {"id": "95ab1f5e-fcb9-4039-ace5-a3f1b5a1dcdc", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1164682355-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "724dbeaf6dbd4d5cb70f81dd6b3f3ba7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed8a78a1-87dc-488e-a092-afd1c2a2ddde", "external-id": "nsx-vlan-transportzone-21", "segmentation_id": 21, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape66cbb47-68", "ovs_interfaceid": "e66cbb47-68bc-47c2-8326-3a3a0058a24f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.377545] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Releasing lock "refresh_cache-3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 736.380014] env[67977]: DEBUG nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Instance network_info: |[{"id": "e66cbb47-68bc-47c2-8326-3a3a0058a24f", "address": "fa:16:3e:fd:c6:52", "network": {"id": "95ab1f5e-fcb9-4039-ace5-a3f1b5a1dcdc", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1164682355-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "724dbeaf6dbd4d5cb70f81dd6b3f3ba7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed8a78a1-87dc-488e-a092-afd1c2a2ddde", "external-id": "nsx-vlan-transportzone-21", "segmentation_id": 21, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape66cbb47-68", "ovs_interfaceid": "e66cbb47-68bc-47c2-8326-3a3a0058a24f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 736.380217] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None 
req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fd:c6:52', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ed8a78a1-87dc-488e-a092-afd1c2a2ddde', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e66cbb47-68bc-47c2-8326-3a3a0058a24f', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 736.388329] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Creating folder: Project (724dbeaf6dbd4d5cb70f81dd6b3f3ba7). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 736.389407] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-affe6bbc-0db4-4189-ab79-e9349a7ecd6c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 736.406271] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Created folder: Project (724dbeaf6dbd4d5cb70f81dd6b3f3ba7) in parent group-v693022. [ 736.406271] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Creating folder: Instances. Parent ref: group-v693051. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 736.406271] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bbfd15b3-2d6e-4117-9631-617bd408fa4c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 736.413862] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Created folder: Instances in parent group-v693051. [ 736.414265] env[67977]: DEBUG oslo.service.loopingcall [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 736.414527] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 736.414701] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5b62f0d8-1921-4f78-8eb8-cb2594fc6816 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 736.438998] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 736.438998] env[67977]: value = "task-3468117" [ 736.438998] env[67977]: _type = "Task" [ 736.438998] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 736.449685] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468117, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 736.955556] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468117, 'name': CreateVM_Task} progress is 99%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 737.193645] env[67977]: DEBUG oslo_concurrency.lockutils [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Acquiring lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 737.194054] env[67977]: DEBUG oslo_concurrency.lockutils [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 737.453054] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468117, 'name': CreateVM_Task, 'duration_secs': 0.532102} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 737.453442] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 737.455715] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 737.455715] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 737.455715] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 737.455715] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1d9209a4-0bc1-441f-9cd1-60b7f61471ac {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.461170] env[67977]: DEBUG oslo_vmware.api [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc 
tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for the task: (returnval){ [ 737.461170] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]525e085b-1c7b-48d8-029e-575b143300a3" [ 737.461170] env[67977]: _type = "Task" [ 737.461170] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 737.469742] env[67977]: DEBUG oslo_vmware.api [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]525e085b-1c7b-48d8-029e-575b143300a3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 737.973463] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 737.973463] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 737.973857] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 738.603981] env[67977]: DEBUG nova.compute.manager [req-527e5a58-0341-4bd0-a3fe-b1e709184d1d req-700ced04-79ff-4053-9e7c-7934d64b7ea3 service nova] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Received event network-vif-plugged-e66cbb47-68bc-47c2-8326-3a3a0058a24f {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 738.604276] env[67977]: DEBUG oslo_concurrency.lockutils [req-527e5a58-0341-4bd0-a3fe-b1e709184d1d req-700ced04-79ff-4053-9e7c-7934d64b7ea3 service nova] Acquiring lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 738.605205] env[67977]: DEBUG oslo_concurrency.lockutils [req-527e5a58-0341-4bd0-a3fe-b1e709184d1d req-700ced04-79ff-4053-9e7c-7934d64b7ea3 service nova] Lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 738.605205] env[67977]: DEBUG oslo_concurrency.lockutils [req-527e5a58-0341-4bd0-a3fe-b1e709184d1d req-700ced04-79ff-4053-9e7c-7934d64b7ea3 service nova] Lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" 
:: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 738.605205] env[67977]: DEBUG nova.compute.manager [req-527e5a58-0341-4bd0-a3fe-b1e709184d1d req-700ced04-79ff-4053-9e7c-7934d64b7ea3 service nova] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] No waiting events found dispatching network-vif-plugged-e66cbb47-68bc-47c2-8326-3a3a0058a24f {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 738.605205] env[67977]: WARNING nova.compute.manager [req-527e5a58-0341-4bd0-a3fe-b1e709184d1d req-700ced04-79ff-4053-9e7c-7934d64b7ea3 service nova] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Received unexpected event network-vif-plugged-e66cbb47-68bc-47c2-8326-3a3a0058a24f for instance with vm_state building and task_state spawning. [ 738.966199] env[67977]: DEBUG nova.network.neutron [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Successfully updated port: dec86def-dbdd-4517-a6ad-9796fbaf3b6b {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 738.986300] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "refresh_cache-af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 738.986572] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquired lock "refresh_cache-af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 738.986609] env[67977]: DEBUG nova.network.neutron [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 739.076413] env[67977]: DEBUG nova.network.neutron [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 739.504149] env[67977]: DEBUG nova.network.neutron [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Updating instance_info_cache with network_info: [{"id": "dec86def-dbdd-4517-a6ad-9796fbaf3b6b", "address": "fa:16:3e:92:5e:02", "network": {"id": "95ab1f5e-fcb9-4039-ace5-a3f1b5a1dcdc", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1164682355-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "724dbeaf6dbd4d5cb70f81dd6b3f3ba7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed8a78a1-87dc-488e-a092-afd1c2a2ddde", "external-id": "nsx-vlan-transportzone-21", "segmentation_id": 21, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdec86def-db", "ovs_interfaceid": "dec86def-dbdd-4517-a6ad-9796fbaf3b6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.525496] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Releasing lock "refresh_cache-af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 739.525496] env[67977]: DEBUG nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Instance network_info: |[{"id": "dec86def-dbdd-4517-a6ad-9796fbaf3b6b", "address": "fa:16:3e:92:5e:02", "network": {"id": "95ab1f5e-fcb9-4039-ace5-a3f1b5a1dcdc", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1164682355-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "724dbeaf6dbd4d5cb70f81dd6b3f3ba7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed8a78a1-87dc-488e-a092-afd1c2a2ddde", "external-id": "nsx-vlan-transportzone-21", "segmentation_id": 21, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdec86def-db", "ovs_interfaceid": "dec86def-dbdd-4517-a6ad-9796fbaf3b6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 739.526273] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:92:5e:02', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ed8a78a1-87dc-488e-a092-afd1c2a2ddde', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'dec86def-dbdd-4517-a6ad-9796fbaf3b6b', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 739.535563] env[67977]: DEBUG oslo.service.loopingcall [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 739.536145] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 739.536367] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2b3caf8f-5c9b-408c-a8f1-c6fa8f679a9f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.562647] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 739.562647] env[67977]: value = "task-3468118" [ 739.562647] env[67977]: _type = "Task" [ 739.562647] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 739.567725] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468118, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 740.070451] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468118, 'name': CreateVM_Task, 'duration_secs': 0.303529} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 740.070711] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 740.073575] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 740.073575] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 740.073575] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 740.073575] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ce40df27-82e6-433b-b80e-e68ebe612429 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.078902] env[67977]: DEBUG oslo_vmware.api [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for the task: (returnval){ [ 740.078902] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]521b5d88-be70-78c1-f176-12472200d2fb" [ 740.078902] env[67977]: _type = "Task" [ 740.078902] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 740.089086] env[67977]: DEBUG oslo_vmware.api [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]521b5d88-be70-78c1-f176-12472200d2fb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 740.257250] env[67977]: DEBUG nova.compute.manager [req-cd510dc9-3454-407f-ade3-01f97cc8b7ec req-af96344b-6979-46cb-9a81-fdc261a36fb7 service nova] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Received event network-changed-c65e528f-bbdf-4a22-acfc-7b2389a6943d {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 740.257250] env[67977]: DEBUG nova.compute.manager [req-cd510dc9-3454-407f-ade3-01f97cc8b7ec req-af96344b-6979-46cb-9a81-fdc261a36fb7 service nova] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Refreshing instance network info cache due to event network-changed-c65e528f-bbdf-4a22-acfc-7b2389a6943d. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 740.257250] env[67977]: DEBUG oslo_concurrency.lockutils [req-cd510dc9-3454-407f-ade3-01f97cc8b7ec req-af96344b-6979-46cb-9a81-fdc261a36fb7 service nova] Acquiring lock "refresh_cache-a2fd776e-9a01-4b67-bc23-1605d6e2b23e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 740.257250] env[67977]: DEBUG oslo_concurrency.lockutils [req-cd510dc9-3454-407f-ade3-01f97cc8b7ec req-af96344b-6979-46cb-9a81-fdc261a36fb7 service nova] Acquired lock "refresh_cache-a2fd776e-9a01-4b67-bc23-1605d6e2b23e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 740.257250] env[67977]: DEBUG nova.network.neutron [req-cd510dc9-3454-407f-ade3-01f97cc8b7ec req-af96344b-6979-46cb-9a81-fdc261a36fb7 service nova] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Refreshing network info cache for port c65e528f-bbdf-4a22-acfc-7b2389a6943d {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 740.591765] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 740.592455] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 740.592455] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 740.902673] env[67977]: DEBUG nova.network.neutron [req-cd510dc9-3454-407f-ade3-01f97cc8b7ec req-af96344b-6979-46cb-9a81-fdc261a36fb7 service nova] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Updated VIF entry in instance network info cache for port c65e528f-bbdf-4a22-acfc-7b2389a6943d. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 740.902673] env[67977]: DEBUG nova.network.neutron [req-cd510dc9-3454-407f-ade3-01f97cc8b7ec req-af96344b-6979-46cb-9a81-fdc261a36fb7 service nova] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Updating instance_info_cache with network_info: [{"id": "c65e528f-bbdf-4a22-acfc-7b2389a6943d", "address": "fa:16:3e:52:2b:dc", "network": {"id": "905d297a-71f8-4582-94d5-1ffc5cf9edae", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-2117133888-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a9224bb08444f6eba3fc60e5b6a0849", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "171aeae0-6a27-44fc-bc3d-a2d5581fc702", "external-id": "nsx-vlan-transportzone-410", "segmentation_id": 410, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc65e528f-bb", "ovs_interfaceid": "c65e528f-bbdf-4a22-acfc-7b2389a6943d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 740.920310] env[67977]: DEBUG oslo_concurrency.lockutils [req-cd510dc9-3454-407f-ade3-01f97cc8b7ec req-af96344b-6979-46cb-9a81-fdc261a36fb7 service nova] Releasing lock "refresh_cache-a2fd776e-9a01-4b67-bc23-1605d6e2b23e" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 741.066138] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquiring lock "b22ae1a7-c9b8-464b-a81c-73144a0176be" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 741.066391] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Lock "b22ae1a7-c9b8-464b-a81c-73144a0176be" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 741.629712] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquiring lock "83b04c8c-39f6-4f58-b965-0a94c063b68b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 741.630232] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Lock 
"83b04c8c-39f6-4f58-b965-0a94c063b68b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 742.766499] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 742.766499] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 742.803561] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 742.804403] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 742.830264] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "b017d568-1ad8-4d8d-84e8-5771341389bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 742.830510] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "b017d568-1ad8-4d8d-84e8-5771341389bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 743.064167] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c82ff3b-f1a7-4838-bede-750271ea1ecd tempest-FloatingIPsAssociationNegativeTestJSON-865865848 tempest-FloatingIPsAssociationNegativeTestJSON-865865848-project-member] Acquiring lock "ee6c409a-0d32-48fa-a873-b9b62040aef7" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 743.064306] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c82ff3b-f1a7-4838-bede-750271ea1ecd tempest-FloatingIPsAssociationNegativeTestJSON-865865848 tempest-FloatingIPsAssociationNegativeTestJSON-865865848-project-member] Lock "ee6c409a-0d32-48fa-a873-b9b62040aef7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 743.614053] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c660dc3a-9f41-4637-9299-21ad45adf48e tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquiring lock "7db91c79-1cdb-4101-a369-583b8bbae870" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 743.614053] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c660dc3a-9f41-4637-9299-21ad45adf48e tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "7db91c79-1cdb-4101-a369-583b8bbae870" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 743.643868] env[67977]: DEBUG nova.compute.manager [req-3e4ab7b4-599b-452c-9a22-64f39cd82021 req-5ad6c1ef-75d4-45c4-bcec-d0d2c71d9a88 service nova] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Received event network-changed-e66cbb47-68bc-47c2-8326-3a3a0058a24f {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 743.643868] env[67977]: DEBUG nova.compute.manager [req-3e4ab7b4-599b-452c-9a22-64f39cd82021 req-5ad6c1ef-75d4-45c4-bcec-d0d2c71d9a88 service nova] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Refreshing instance network info cache due to event network-changed-e66cbb47-68bc-47c2-8326-3a3a0058a24f. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 743.644894] env[67977]: DEBUG oslo_concurrency.lockutils [req-3e4ab7b4-599b-452c-9a22-64f39cd82021 req-5ad6c1ef-75d4-45c4-bcec-d0d2c71d9a88 service nova] Acquiring lock "refresh_cache-3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 743.644894] env[67977]: DEBUG oslo_concurrency.lockutils [req-3e4ab7b4-599b-452c-9a22-64f39cd82021 req-5ad6c1ef-75d4-45c4-bcec-d0d2c71d9a88 service nova] Acquired lock "refresh_cache-3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 743.645227] env[67977]: DEBUG nova.network.neutron [req-3e4ab7b4-599b-452c-9a22-64f39cd82021 req-5ad6c1ef-75d4-45c4-bcec-d0d2c71d9a88 service nova] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Refreshing network info cache for port e66cbb47-68bc-47c2-8326-3a3a0058a24f {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 744.214015] env[67977]: DEBUG nova.network.neutron [req-3e4ab7b4-599b-452c-9a22-64f39cd82021 req-5ad6c1ef-75d4-45c4-bcec-d0d2c71d9a88 service nova] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Updated VIF entry in instance network info cache for port e66cbb47-68bc-47c2-8326-3a3a0058a24f. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 744.214015] env[67977]: DEBUG nova.network.neutron [req-3e4ab7b4-599b-452c-9a22-64f39cd82021 req-5ad6c1ef-75d4-45c4-bcec-d0d2c71d9a88 service nova] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Updating instance_info_cache with network_info: [{"id": "e66cbb47-68bc-47c2-8326-3a3a0058a24f", "address": "fa:16:3e:fd:c6:52", "network": {"id": "95ab1f5e-fcb9-4039-ace5-a3f1b5a1dcdc", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1164682355-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "724dbeaf6dbd4d5cb70f81dd6b3f3ba7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed8a78a1-87dc-488e-a092-afd1c2a2ddde", "external-id": "nsx-vlan-transportzone-21", "segmentation_id": 21, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape66cbb47-68", "ovs_interfaceid": "e66cbb47-68bc-47c2-8326-3a3a0058a24f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.230911] env[67977]: DEBUG oslo_concurrency.lockutils [req-3e4ab7b4-599b-452c-9a22-64f39cd82021 req-5ad6c1ef-75d4-45c4-bcec-d0d2c71d9a88 service nova] Releasing lock "refresh_cache-3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 744.503714] env[67977]: DEBUG nova.compute.manager [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Received event 
network-vif-plugged-dec86def-dbdd-4517-a6ad-9796fbaf3b6b {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 744.503940] env[67977]: DEBUG oslo_concurrency.lockutils [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] Acquiring lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 744.504457] env[67977]: DEBUG oslo_concurrency.lockutils [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] Lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 744.504832] env[67977]: DEBUG oslo_concurrency.lockutils [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] Lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 744.505254] env[67977]: DEBUG nova.compute.manager [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] No waiting events found dispatching network-vif-plugged-dec86def-dbdd-4517-a6ad-9796fbaf3b6b {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 744.505254] env[67977]: WARNING nova.compute.manager [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Received unexpected event network-vif-plugged-dec86def-dbdd-4517-a6ad-9796fbaf3b6b for instance with vm_state building and task_state spawning. [ 744.505343] env[67977]: DEBUG nova.compute.manager [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Received event network-changed-dec86def-dbdd-4517-a6ad-9796fbaf3b6b {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 744.505482] env[67977]: DEBUG nova.compute.manager [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Refreshing instance network info cache due to event network-changed-dec86def-dbdd-4517-a6ad-9796fbaf3b6b. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 744.505665] env[67977]: DEBUG oslo_concurrency.lockutils [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] Acquiring lock "refresh_cache-af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 744.505799] env[67977]: DEBUG oslo_concurrency.lockutils [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] Acquired lock "refresh_cache-af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 744.505953] env[67977]: DEBUG nova.network.neutron [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Refreshing network info cache for port dec86def-dbdd-4517-a6ad-9796fbaf3b6b {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 744.818222] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7e45e376-a36f-4839-a332-410ca11ed964 tempest-ServerExternalEventsTest-819465719 tempest-ServerExternalEventsTest-819465719-project-member] Acquiring lock "e1027e0e-7938-4772-84c2-f879e9ce4144" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 744.818222] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7e45e376-a36f-4839-a332-410ca11ed964 tempest-ServerExternalEventsTest-819465719 tempest-ServerExternalEventsTest-819465719-project-member] Lock "e1027e0e-7938-4772-84c2-f879e9ce4144" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 745.256929] env[67977]: DEBUG nova.network.neutron [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Updated VIF entry in instance network info cache for port dec86def-dbdd-4517-a6ad-9796fbaf3b6b. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 745.256929] env[67977]: DEBUG nova.network.neutron [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Updating instance_info_cache with network_info: [{"id": "dec86def-dbdd-4517-a6ad-9796fbaf3b6b", "address": "fa:16:3e:92:5e:02", "network": {"id": "95ab1f5e-fcb9-4039-ace5-a3f1b5a1dcdc", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1164682355-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "724dbeaf6dbd4d5cb70f81dd6b3f3ba7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed8a78a1-87dc-488e-a092-afd1c2a2ddde", "external-id": "nsx-vlan-transportzone-21", "segmentation_id": 21, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdec86def-db", "ovs_interfaceid": "dec86def-dbdd-4517-a6ad-9796fbaf3b6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.268432] env[67977]: DEBUG oslo_concurrency.lockutils [req-de5daa4e-1d8e-4a60-8114-eca4e546b380 req-f0863503-0b5d-4d5d-bde5-d8612ca52700 service nova] Releasing lock "refresh_cache-af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 745.619933] env[67977]: DEBUG oslo_concurrency.lockutils [None req-850f3328-669b-44f9-bf28-bee378ec3316 tempest-ServerAddressesNegativeTestJSON-1982447062 tempest-ServerAddressesNegativeTestJSON-1982447062-project-member] Acquiring lock "3963518a-23de-434e-9f88-392a80daf120" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 745.620183] env[67977]: DEBUG oslo_concurrency.lockutils [None req-850f3328-669b-44f9-bf28-bee378ec3316 tempest-ServerAddressesNegativeTestJSON-1982447062 tempest-ServerAddressesNegativeTestJSON-1982447062-project-member] Lock "3963518a-23de-434e-9f88-392a80daf120" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 746.930083] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4e5c801c-f63d-455e-abe6-da5996ab2bd1 tempest-VolumesAssistedSnapshotsTest-1609573494 tempest-VolumesAssistedSnapshotsTest-1609573494-project-member] Acquiring lock "b462ac2e-d668-4ac1-a6b3-2cfb49a2e0cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 746.930409] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4e5c801c-f63d-455e-abe6-da5996ab2bd1 tempest-VolumesAssistedSnapshotsTest-1609573494 
tempest-VolumesAssistedSnapshotsTest-1609573494-project-member] Lock "b462ac2e-d668-4ac1-a6b3-2cfb49a2e0cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 752.398121] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6ce1fff8-9173-4d48-8e91-86a16d3be6e9 tempest-ServersWithSpecificFlavorTestJSON-2118342104 tempest-ServersWithSpecificFlavorTestJSON-2118342104-project-member] Acquiring lock "8870c8cf-bf83-482d-91a9-47fdedc79586" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 752.398121] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6ce1fff8-9173-4d48-8e91-86a16d3be6e9 tempest-ServersWithSpecificFlavorTestJSON-2118342104 tempest-ServersWithSpecificFlavorTestJSON-2118342104-project-member] Lock "8870c8cf-bf83-482d-91a9-47fdedc79586" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 757.758118] env[67977]: DEBUG oslo_concurrency.lockutils [None req-939b9d0b-252f-4f4e-b159-2fb267c491f9 tempest-ServerDiagnosticsTest-1479388244 tempest-ServerDiagnosticsTest-1479388244-project-member] Acquiring lock "bb8b7561-424e-48ba-9faa-65d6f6465a20" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 757.758394] env[67977]: DEBUG oslo_concurrency.lockutils [None req-939b9d0b-252f-4f4e-b159-2fb267c491f9 tempest-ServerDiagnosticsTest-1479388244 tempest-ServerDiagnosticsTest-1479388244-project-member] Lock "bb8b7561-424e-48ba-9faa-65d6f6465a20" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 758.989285] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8273f9d9-aa1c-4d37-8c7f-7b17bde62d7c tempest-AttachInterfacesV270Test-1989404753 tempest-AttachInterfacesV270Test-1989404753-project-member] Acquiring lock "d6893024-9531-435b-8893-38f310224d7b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 758.989820] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8273f9d9-aa1c-4d37-8c7f-7b17bde62d7c tempest-AttachInterfacesV270Test-1989404753 tempest-AttachInterfacesV270Test-1989404753-project-member] Lock "d6893024-9531-435b-8893-38f310224d7b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 759.918800] env[67977]: DEBUG oslo_concurrency.lockutils [None req-abeac5ed-1bd2-4ec7-91ee-5fab80fd36c9 tempest-ServersTestJSON-1266629660 tempest-ServersTestJSON-1266629660-project-member] Acquiring lock "0d6c2ea5-71ff-49bb-ae23-3323e9b4a3f6" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 759.918800] env[67977]: DEBUG oslo_concurrency.lockutils [None req-abeac5ed-1bd2-4ec7-91ee-5fab80fd36c9 tempest-ServersTestJSON-1266629660 tempest-ServersTestJSON-1266629660-project-member] Lock "0d6c2ea5-71ff-49bb-ae23-3323e9b4a3f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 761.000489] env[67977]: DEBUG oslo_concurrency.lockutils [None req-827b8bbb-c3bf-43cb-a135-ab3275182ba2 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquiring lock "72a209af-5976-4943-9752-8c258bb24158" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 761.000868] env[67977]: DEBUG oslo_concurrency.lockutils [None req-827b8bbb-c3bf-43cb-a135-ab3275182ba2 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "72a209af-5976-4943-9752-8c258bb24158" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 761.270581] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b59f81c3-59c6-487a-a1d4-f0c4b6f3c5f0 tempest-ServersAdminNegativeTestJSON-393042103 tempest-ServersAdminNegativeTestJSON-393042103-project-member] Acquiring lock "21a172f7-20d4-4f17-af4d-cadc0fa33c1d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 761.270825] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b59f81c3-59c6-487a-a1d4-f0c4b6f3c5f0 tempest-ServersAdminNegativeTestJSON-393042103 tempest-ServersAdminNegativeTestJSON-393042103-project-member] Lock "21a172f7-20d4-4f17-af4d-cadc0fa33c1d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 762.813750] env[67977]: WARNING oslo_vmware.rw_handles [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 762.813750] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 762.813750] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 762.813750] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 762.813750] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 762.813750] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 762.813750] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 
762.813750] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 762.813750] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 762.813750] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 762.813750] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 762.813750] env[67977]: ERROR oslo_vmware.rw_handles [ 762.813750] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/41c4bdce-1831-4291-a1fd-7e3acdc6075c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 762.815345] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 762.815345] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Copying Virtual Disk [datastore1] vmware_temp/41c4bdce-1831-4291-a1fd-7e3acdc6075c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/41c4bdce-1831-4291-a1fd-7e3acdc6075c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 762.815345] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0f8b67c5-109c-46eb-b600-536026026742 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.824485] env[67977]: DEBUG oslo_vmware.api [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Waiting for the task: (returnval){ [ 762.824485] env[67977]: value = "task-3468119" [ 762.824485] env[67977]: _type = "Task" [ 762.824485] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 762.833198] env[67977]: DEBUG oslo_vmware.api [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Task: {'id': task-3468119, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 763.336386] env[67977]: DEBUG oslo_vmware.exceptions [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 763.336671] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 763.339811] env[67977]: ERROR nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 763.339811] env[67977]: Faults: ['InvalidArgument'] [ 763.339811] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Traceback (most recent call last): [ 763.339811] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 763.339811] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] yield resources [ 763.339811] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 763.339811] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] self.driver.spawn(context, instance, image_meta, [ 763.339811] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 763.339811] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] self._vmops.spawn(context, instance, image_meta, injected_files, [ 763.339811] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 763.339811] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] self._fetch_image_if_missing(context, vi) [ 763.339811] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] image_cache(vi, tmp_image_ds_loc) [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] vm_util.copy_virtual_disk( [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] session._wait_for_task(vmdk_copy_task) [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] return self.wait_for_task(task_ref) [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] return evt.wait() [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] result = hub.switch() [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 763.340695] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] return self.greenlet.switch() [ 763.342197] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 763.342197] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] self.f(*self.args, **self.kw) [ 763.342197] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 763.342197] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] raise exceptions.translate_fault(task_info.error) [ 763.342197] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 763.342197] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Faults: ['InvalidArgument'] [ 763.342197] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] [ 763.342197] env[67977]: INFO nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Terminating instance [ 763.342473] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 763.342717] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 763.344604] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Acquiring lock 
"refresh_cache-7900e978-def6-4636-a2cb-94c322a23d15" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 763.344776] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Acquired lock "refresh_cache-7900e978-def6-4636-a2cb-94c322a23d15" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 763.344948] env[67977]: DEBUG nova.network.neutron [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 763.345956] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-571e6520-f7e4-4c68-a45a-c270845eb3d5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.362256] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 763.362256] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 763.362256] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aa94701d-ed86-44e2-bf41-51e8fa76c952 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.366377] env[67977]: DEBUG oslo_vmware.api [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Waiting for the task: (returnval){ [ 763.366377] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]521bd5a2-98b2-add3-8c93-637d140bcde9" [ 763.366377] env[67977]: _type = "Task" [ 763.366377] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 763.374526] env[67977]: DEBUG oslo_vmware.api [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]521bd5a2-98b2-add3-8c93-637d140bcde9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 763.409269] env[67977]: DEBUG nova.network.neutron [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 763.620777] env[67977]: DEBUG nova.network.neutron [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.636990] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Releasing lock "refresh_cache-7900e978-def6-4636-a2cb-94c322a23d15" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 763.637400] env[67977]: DEBUG nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 763.637543] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 763.639418] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92b2adc1-c928-49e0-8ab0-1bc5f03828ff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.648204] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 763.648449] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8dedc524-f243-420c-bd42-ed08da30532f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.684095] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 763.684349] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 763.684549] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Deleting the datastore file [datastore1] 7900e978-def6-4636-a2cb-94c322a23d15 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 
763.684813] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ea221281-b501-450d-a2f4-63cf88b415c8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.691333] env[67977]: DEBUG oslo_vmware.api [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Waiting for the task: (returnval){ [ 763.691333] env[67977]: value = "task-3468121" [ 763.691333] env[67977]: _type = "Task" [ 763.691333] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 763.702234] env[67977]: DEBUG oslo_vmware.api [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Task: {'id': task-3468121, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 763.882052] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 763.882390] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Creating directory with path [datastore1] vmware_temp/c98d0374-dc9e-48a1-b097-59763559e059/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 763.882646] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-016216a9-d196-4243-b4a8-37c9f36aff59 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.896294] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Created directory with path [datastore1] vmware_temp/c98d0374-dc9e-48a1-b097-59763559e059/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 763.896913] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Fetch image to [datastore1] vmware_temp/c98d0374-dc9e-48a1-b097-59763559e059/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 763.897029] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/c98d0374-dc9e-48a1-b097-59763559e059/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 763.897841] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d2f2179-903f-4992-913b-4dec37b6287e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.909483] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9310e382-9532-4f2f-aad2-c05b54fb4856 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.919347] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-727361a0-0a58-4aec-90bd-2fc2274ef538 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.955151] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb92537f-85d9-4677-80f6-b312d409ec67 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.962384] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4a9b43dd-e0e2-4d50-90fc-85570ba75541 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.983351] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 764.050085] env[67977]: DEBUG oslo_vmware.rw_handles [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c98d0374-dc9e-48a1-b097-59763559e059/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 764.115743] env[67977]: DEBUG oslo_vmware.rw_handles [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 764.115743] env[67977]: DEBUG oslo_vmware.rw_handles [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c98d0374-dc9e-48a1-b097-59763559e059/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 764.203919] env[67977]: DEBUG oslo_vmware.api [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Task: {'id': task-3468121, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067377} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 764.207090] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 764.207090] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 764.207090] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 764.207090] env[67977]: INFO nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Took 0.57 seconds to destroy the instance on the hypervisor. [ 764.207090] env[67977]: DEBUG oslo.service.loopingcall [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 764.207590] env[67977]: DEBUG nova.compute.manager [-] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 764.209090] env[67977]: DEBUG nova.compute.claims [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 764.211962] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 764.211962] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 764.785637] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5a24e19-6dba-4c6e-84d0-4037938abdff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.798196] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c412284d-d095-4bd1-9ad8-05db0f2f6e11 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.828036] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e9722b3-7a20-44c2-9e3f-bd3e738d6da6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.835454] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fea135f8-b785-4bf9-9caf-826ad23d203d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.849025] env[67977]: DEBUG nova.compute.provider_tree [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 764.861403] env[67977]: DEBUG nova.scheduler.client.report [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 764.886692] env[67977]: DEBUG oslo_concurrency.lockutils [None 
req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.676s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 764.886692] env[67977]: ERROR nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 764.886692] env[67977]: Faults: ['InvalidArgument'] [ 764.886692] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Traceback (most recent call last): [ 764.886692] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 764.886692] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] self.driver.spawn(context, instance, image_meta, [ 764.886692] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 764.886692] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] self._vmops.spawn(context, instance, image_meta, injected_files, [ 764.886692] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 764.886692] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] self._fetch_image_if_missing(context, vi) [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] image_cache(vi, tmp_image_ds_loc) [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] vm_util.copy_virtual_disk( [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] session._wait_for_task(vmdk_copy_task) [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] return self.wait_for_task(task_ref) [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] return evt.wait() [ 
764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] result = hub.switch() [ 764.887298] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 764.887644] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] return self.greenlet.switch() [ 764.887644] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 764.887644] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] self.f(*self.args, **self.kw) [ 764.887644] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 764.887644] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] raise exceptions.translate_fault(task_info.error) [ 764.887644] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 764.887644] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Faults: ['InvalidArgument'] [ 764.887644] env[67977]: ERROR nova.compute.manager [instance: 7900e978-def6-4636-a2cb-94c322a23d15] [ 764.887644] env[67977]: DEBUG nova.compute.utils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 764.893591] env[67977]: DEBUG nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Build of instance 7900e978-def6-4636-a2cb-94c322a23d15 was re-scheduled: A specified parameter was not correct: fileType [ 764.893591] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 764.893998] env[67977]: DEBUG nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 764.894252] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Acquiring lock "refresh_cache-7900e978-def6-4636-a2cb-94c322a23d15" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 764.894376] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 
tempest-ServersAdmin275Test-1481269565-project-member] Acquired lock "refresh_cache-7900e978-def6-4636-a2cb-94c322a23d15" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 764.894564] env[67977]: DEBUG nova.network.neutron [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 764.933556] env[67977]: DEBUG nova.network.neutron [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 765.082161] env[67977]: DEBUG nova.network.neutron [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 765.096074] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Releasing lock "refresh_cache-7900e978-def6-4636-a2cb-94c322a23d15" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 765.096369] env[67977]: DEBUG nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 765.096500] env[67977]: DEBUG nova.compute.manager [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] [instance: 7900e978-def6-4636-a2cb-94c322a23d15] Skipping network deallocation for instance since networking was not requested. {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 765.221524] env[67977]: INFO nova.scheduler.client.report [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Deleted allocations for instance 7900e978-def6-4636-a2cb-94c322a23d15 [ 765.255293] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4cb8f1cc-b726-4a54-b04d-c4467555310c tempest-ServersAdmin275Test-1481269565 tempest-ServersAdmin275Test-1481269565-project-member] Lock "7900e978-def6-4636-a2cb-94c322a23d15" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 54.885s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 765.283166] env[67977]: DEBUG nova.compute.manager [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Starting instance... 
[ 765.347093] env[67977]: DEBUG oslo_concurrency.lockutils [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 765.347356] env[67977]: DEBUG oslo_concurrency.lockutils [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 765.350608] env[67977]: INFO nova.compute.claims [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 765.858525] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f91d181c-ca6c-44cc-9ed1-c8c20e4b0b53 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 765.866884] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b9e4d6e-429b-4157-af8c-cd1483c04935 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 765.908273] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14c2c164-f15a-4f74-b0fd-4d1d45fdf77a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 765.915019] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a4d28a6-7de5-4c4d-922a-1205cbc8bad1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 765.929694] env[67977]: DEBUG nova.compute.provider_tree [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 765.940797] env[67977]: DEBUG nova.scheduler.client.report [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 765.959990] env[67977]: DEBUG oslo_concurrency.lockutils [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.611s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 765.959990] env[67977]: DEBUG nova.compute.manager [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 766.012994] env[67977]: DEBUG nova.compute.utils [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 766.014564] env[67977]: DEBUG nova.compute.manager [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 766.014741] env[67977]: DEBUG nova.network.neutron [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 766.035370] env[67977]: DEBUG nova.compute.manager [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
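Each instance claim above runs under the process-wide "compute_resources" lock, and lockutils logs both how long the claim waited for the lock and how long it held it. A rough sketch of that serialize-then-test pattern (a hypothetical helper, not nova's ResourceTracker):

    import threading
    import time

    _compute_resources = threading.Lock()

    def instance_claim(free, want):
        # free/want map resource classes to amounts, e.g. {"VCPU": 1, "MEMORY_MB": 128}.
        t0 = time.monotonic()
        with _compute_resources:
            waited = time.monotonic() - t0
            t1 = time.monotonic()
            ok = all(free.get(rc, 0) >= amt for rc, amt in want.items())
            if ok:
                for rc, amt in want.items():
                    free[rc] -= amt
            held = time.monotonic() - t1
            print(f'Lock "compute_resources" :: waited {waited:.3f}s, held {held:.3f}s')
            return ok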
[ 766.078032] env[67977]: INFO nova.virt.block_device [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Booting with volume f48842bb-527b-4951-a92b-36a75954ad22 at /dev/sda
[ 766.156800] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2e64a0b5-2c60-4b17-bd11-033de2592730 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 766.166585] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b853fa2d-1000-4a53-97ae-e3324efc35ff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 766.183831] env[67977]: DEBUG nova.policy [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1fab403da94640aa92519f9f639e1d10', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '570bfe0d22ef4aacb477b8cf01505918', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}}
[ 766.186754] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "d7719b11-cef7-4878-a693-24dcd085a1d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 766.186754] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "d7719b11-cef7-4878-a693-24dcd085a1d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 766.205134] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-52a7cbeb-1b20-4ec5-bff6-949a49853baf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 766.214995] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-426ae52d-9181-4416-88a7-fa68fb9708d9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 766.249332] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e74f3d3a-9eb6-4e9e-90f9-3a47f21727ad {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 766.256190] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c8af318-feaa-4408-856d-7b062461ad84 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 766.269886] env[67977]: DEBUG nova.virt.block_device [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Updating existing volume attachment record: 0ae30934-b5a3-4a41-8316-0ca5f2100c99 {{(pid=67977) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}}
[ 767.033609] env[67977]: DEBUG nova.compute.manager [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
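This server boots from volume f48842bb-527b-4951-a92b-36a75954ad22 at /dev/sda, and the driver reuses the existing Cinder attachment record rather than creating a new one. Reduced to the fields visible in this log, the block-device information for such a boot-from-volume instance looks roughly like the literal below (an illustrative sketch; the full structure is echoed verbatim further down in the log):

    bdi = {
        "root_device_name": "/dev/sda",
        "image": [],       # no image-backed root disk: the volume is the root
        "ephemerals": [],
        "swap": None,
        "block_device_mapping": [{
            "mount_device": "/dev/sda",
            "boot_index": 0,                # marks the volume as the boot disk
            "delete_on_termination": True,
            "attachment_id": "0ae30934-b5a3-4a41-8316-0ca5f2100c99",
            "connection_info": {
                "driver_volume_type": "vmdk",
                "data": {"volume_id": "f48842bb-527b-4951-a92b-36a75954ad22",
                         "access_mode": "rw"},
            },
        }],
    }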
[ 767.033609] env[67977]: DEBUG nova.virt.hardware [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 767.033609] env[67977]: DEBUG nova.virt.hardware [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 767.034045] env[67977]: DEBUG nova.virt.hardware [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 767.034045] env[67977]: DEBUG nova.virt.hardware [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 767.034122] env[67977]: DEBUG nova.virt.hardware [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 767.034221] env[67977]: DEBUG nova.virt.hardware [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 767.034446] env[67977]: DEBUG nova.virt.hardware [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 767.035108] env[67977]: DEBUG nova.virt.hardware [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 767.035108] env[67977]: DEBUG nova.virt.hardware [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 767.035108] env[67977]: DEBUG nova.virt.hardware [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 767.035108] env[67977]: DEBUG nova.virt.hardware [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
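With no flavor or image topology constraints (preferred 0:0:0), the search above degenerates to the single split 1:1:1 for one vCPU. A compact sketch of the enumeration the hardware module is logging (simplified; nova additionally honors preferences and ordering):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Yield every (sockets, cores, threads) split whose product is vcpus.
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        yield (s, c, t)

    print(list(possible_topologies(1)))  # [(1, 1, 1)] -- matches the log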
[ 767.036663] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-623179fa-8690-4739-a6ad-d7747ace4f17 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 767.048493] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63478f23-bd97-4551-a9eb-47a424f62a3f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 767.255666] env[67977]: DEBUG nova.network.neutron [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Successfully created port: 84523293-e6ef-40e3-9c1a-8eefa9c39d13 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 769.009007] env[67977]: DEBUG nova.network.neutron [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Successfully updated port: 84523293-e6ef-40e3-9c1a-8eefa9c39d13 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 769.032201] env[67977]: DEBUG oslo_concurrency.lockutils [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Acquiring lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 769.032362] env[67977]: DEBUG oslo_concurrency.lockutils [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Acquired lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 769.032509] env[67977]: DEBUG nova.network.neutron [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 769.122110] env[67977]: DEBUG nova.network.neutron [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 769.480522] env[67977]: DEBUG nova.network.neutron [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Updating instance_info_cache with network_info: [{"id": "84523293-e6ef-40e3-9c1a-8eefa9c39d13", "address": "fa:16:3e:bf:5b:ef", "network": {"id": "b3179acf-cc67-477c-b095-6ec83690b6ae", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1388201138-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "570bfe0d22ef4aacb477b8cf01505918", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9c1b8689-a9b4-4972-beb9-6a1c8de1dc88", "external-id": "nsx-vlan-transportzone-455", "segmentation_id": 455, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap84523293-e6", "ovs_interfaceid": "84523293-e6ef-40e3-9c1a-8eefa9c39d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 769.498887] env[67977]: DEBUG oslo_concurrency.lockutils [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Releasing lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
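The cached network_info entry above is next translated into the much smaller "Instance VIF info" structure the VMware driver consumes; the NSX logical-switch id becomes an OpaqueNetwork reference. A condensed sketch of that mapping, using only fields present in the entry above (not the actual vmwareapi code):

    def vif_to_vmware(vif):
        details = vif["details"]
        return {
            "network_name": vif["network"]["bridge"],   # 'br-int'
            "mac_address": vif["address"],
            "iface_id": vif["id"],
            "vif_model": "vmxnet3",
            "network_ref": {
                "type": "OpaqueNetwork",
                "network-id": details["nsx-logical-switch-id"],
                "network-type": "nsx.LogicalSwitch",
                "use-external-id": True,
            },
        }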
"subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "570bfe0d22ef4aacb477b8cf01505918", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9c1b8689-a9b4-4972-beb9-6a1c8de1dc88", "external-id": "nsx-vlan-transportzone-455", "segmentation_id": 455, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap84523293-e6", "ovs_interfaceid": "84523293-e6ef-40e3-9c1a-8eefa9c39d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 769.499410] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bf:5b:ef', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9c1b8689-a9b4-4972-beb9-6a1c8de1dc88', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '84523293-e6ef-40e3-9c1a-8eefa9c39d13', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 769.507130] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Creating folder: Project (570bfe0d22ef4aacb477b8cf01505918). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 769.507714] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1e5588e6-49b3-4b0c-8c9f-f26df7060429 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.520914] env[67977]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 769.521090] env[67977]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=67977) _invoke_api /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:337}} [ 769.521450] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Folder already exists: Project (570bfe0d22ef4aacb477b8cf01505918). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 769.522025] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Creating folder: Instances. Parent ref: group-v693023. 
[ 769.522025] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e557bcfc-3d57-434c-b64b-ca0173486d03 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 769.531308] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Created folder: Instances in parent group-v693023.
[ 769.531626] env[67977]: DEBUG oslo.service.loopingcall [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 769.531751] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 769.531938] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-956220cd-cb61-4107-9a4a-e45c1f73f0e2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 769.554408] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 769.554408] env[67977]: value = "task-3468124"
[ 769.554408] env[67977]: _type = "Task"
[ 769.554408] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 769.563988] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468124, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
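CreateVM_Task returns a task moref immediately; the "Waiting for the task ... to complete" block and the progress lines that follow come from wait_for_task polling it. A sketch of the call pattern against the oslo_vmware session API (the folder, config-spec, and resource-pool morefs are placeholders obtained elsewhere):

    def create_vm(session, vm_folder_ref, config_spec, res_pool_ref):
        task_ref = session.invoke_api(session.vim, "CreateVM_Task",
                                      vm_folder_ref, config=config_spec,
                                      pool=res_pool_ref)
        # Blocks while logging progress ('CreateVM_Task progress is 0%.'),
        # and raises if the task errors out.
        task_info = session.wait_for_task(task_ref)
        return task_info.result  # moref of the new VM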
[ 769.566313] env[67977]: DEBUG nova.compute.manager [req-22496b6f-8a30-47fd-bf39-81fc32f526cb req-3d0586fd-1c94-4f7e-9641-657486f8e04f service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Received event network-vif-plugged-84523293-e6ef-40e3-9c1a-8eefa9c39d13 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 769.566690] env[67977]: DEBUG oslo_concurrency.lockutils [req-22496b6f-8a30-47fd-bf39-81fc32f526cb req-3d0586fd-1c94-4f7e-9641-657486f8e04f service nova] Acquiring lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 769.569773] env[67977]: DEBUG oslo_concurrency.lockutils [req-22496b6f-8a30-47fd-bf39-81fc32f526cb req-3d0586fd-1c94-4f7e-9641-657486f8e04f service nova] Lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 769.569773] env[67977]: DEBUG oslo_concurrency.lockutils [req-22496b6f-8a30-47fd-bf39-81fc32f526cb req-3d0586fd-1c94-4f7e-9641-657486f8e04f service nova] Lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 769.569773] env[67977]: DEBUG nova.compute.manager [req-22496b6f-8a30-47fd-bf39-81fc32f526cb req-3d0586fd-1c94-4f7e-9641-657486f8e04f service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] No waiting events found dispatching network-vif-plugged-84523293-e6ef-40e3-9c1a-8eefa9c39d13 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 769.569773] env[67977]: WARNING nova.compute.manager [req-22496b6f-8a30-47fd-bf39-81fc32f526cb req-3d0586fd-1c94-4f7e-9641-657486f8e04f service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Received unexpected event network-vif-plugged-84523293-e6ef-40e3-9c1a-8eefa9c39d13 for instance with vm_state building and task_state spawning.
[ 770.070227] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468124, 'name': CreateVM_Task, 'duration_secs': 0.35296} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
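"No waiting events found" plus the unexpected-event warning above simply means Neutron's network-vif-plugged callback arrived before the driver registered a waiter for it; the event is dropped, not fatal. The underlying mechanism is a map keyed by (event name, tag), roughly like this hypothetical minimal version (not nova's InstanceEvents class):

    import threading

    _waiters = {}          # (event_name, tag) -> threading.Event
    _lock = threading.Lock()

    def prepare_for_event(name, tag):
        ev = threading.Event()
        with _lock:
            _waiters[(name, tag)] = ev
        return ev            # caller later does ev.wait(timeout=...)

    def pop_event(name, tag):
        with _lock:
            ev = _waiters.pop((name, tag), None)
        if ev is None:
            print(f"Received unexpected event {name}-{tag}")
        else:
            ev.set()         # wake the spawning thread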
[ 770.070663] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 770.071239] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': None, 'attachment_id': '0ae30934-b5a3-4a41-8316-0ca5f2100c99', 'disk_bus': None, 'mount_device': '/dev/sda', 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-693037', 'volume_id': 'f48842bb-527b-4951-a92b-36a75954ad22', 'name': 'volume-f48842bb-527b-4951-a92b-36a75954ad22', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1b5b8be5-7e9c-4269-994a-e54aeb75774f', 'attached_at': '', 'detached_at': '', 'volume_id': 'f48842bb-527b-4951-a92b-36a75954ad22', 'serial': 'f48842bb-527b-4951-a92b-36a75954ad22'}, 'guest_format': None, 'boot_index': 0, 'delete_on_termination': True, 'volume_type': None}], 'swap': None} {{(pid=67977) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}}
[ 770.071672] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Root volume attach. Driver type: vmdk {{(pid=67977) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}}
[ 770.072906] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-986d7bd4-25e6-4e17-9275-e2ed7f61f6f7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.081819] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c3ba120-7e54-469f-ac7b-803614fb9c47 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.088762] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db733726-f27a-461a-8e3a-d948e6539859 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.099139] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-67c7fd84-fdc2-44ff-9dd2-e77bedf3fa42 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.106493] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Waiting for the task: (returnval){
[ 770.106493] env[67977]: value = "task-3468125"
[ 770.106493] env[67977]: _type = "Task"
[ 770.106493] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 770.114916] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468125, 'name': RelocateVM_Task} progress is 5%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 770.622459] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468125, 'name': RelocateVM_Task, 'duration_secs': 0.360532} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 770.622796] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Volume attach. Driver type: vmdk {{(pid=67977) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}}
[ 770.623067] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-693037', 'volume_id': 'f48842bb-527b-4951-a92b-36a75954ad22', 'name': 'volume-f48842bb-527b-4951-a92b-36a75954ad22', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1b5b8be5-7e9c-4269-994a-e54aeb75774f', 'attached_at': '', 'detached_at': '', 'volume_id': 'f48842bb-527b-4951-a92b-36a75954ad22', 'serial': 'f48842bb-527b-4951-a92b-36a75954ad22'} {{(pid=67977) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}}
[ 770.624078] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0cd2ac4-4a31-4d89-a676-7fea9c02d262 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.645695] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea70b6af-a806-4272-9d00-406fe29f2d05 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.672538] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Reconfiguring VM instance instance-0000000b to attach disk [datastore1] volume-f48842bb-527b-4951-a92b-36a75954ad22/volume-f48842bb-527b-4951-a92b-36a75954ad22.vmdk or device None with type thin {{(pid=67977) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}}
[ 770.673035] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-764befd3-95bd-4125-9800-62ea7d27ac65 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
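The ReconfigVM_Task above is what actually wires the volume's VMDK into the new VM: a VirtualMachineConfigSpec with a single deviceChange entry that adds a VirtualDisk backed by the existing file. A sketch built with the SOAP client factory (controller key and unit number are assumed inputs here; nova derives them from the VM's existing devices):

    def attach_vmdk_spec(client_factory, controller_key, unit_number, file_path):
        backing = client_factory.create('ns0:VirtualDiskFlatVer2BackingInfo')
        backing.diskMode = 'persistent'
        backing.thinProvisioned = True     # 'with type thin' in the log line
        backing.fileName = file_path       # '[datastore1] volume-.../volume-....vmdk'

        disk = client_factory.create('ns0:VirtualDisk')
        disk.backing = backing
        disk.controllerKey = controller_key
        disk.unitNumber = unit_number
        disk.key = -100                    # negative key = new device

        change = client_factory.create('ns0:VirtualDeviceConfigSpec')
        change.operation = 'add'
        change.device = disk

        spec = client_factory.create('ns0:VirtualMachineConfigSpec')
        spec.deviceChange = [change]
        return spec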
[ 770.701054] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Waiting for the task: (returnval){
[ 770.701054] env[67977]: value = "task-3468126"
[ 770.701054] env[67977]: _type = "Task"
[ 770.701054] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 770.710331] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468126, 'name': ReconfigVM_Task} progress is 6%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 771.216471] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468126, 'name': ReconfigVM_Task, 'duration_secs': 0.265452} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 771.216790] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Reconfigured VM instance instance-0000000b to attach disk [datastore1] volume-f48842bb-527b-4951-a92b-36a75954ad22/volume-f48842bb-527b-4951-a92b-36a75954ad22.vmdk or device None with type thin {{(pid=67977) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}}
[ 771.222321] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-c2c36b46-28dd-4ab4-a023-21938a810938 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 771.240651] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Waiting for the task: (returnval){
[ 771.240651] env[67977]: value = "task-3468127"
[ 771.240651] env[67977]: _type = "Task"
[ 771.240651] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 771.249873] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468127, 'name': ReconfigVM_Task} progress is 6%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 771.753710] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468127, 'name': ReconfigVM_Task} progress is 14%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 772.259078] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468127, 'name': ReconfigVM_Task, 'duration_secs': 0.790071} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 772.259440] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-693037', 'volume_id': 'f48842bb-527b-4951-a92b-36a75954ad22', 'name': 'volume-f48842bb-527b-4951-a92b-36a75954ad22', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1b5b8be5-7e9c-4269-994a-e54aeb75774f', 'attached_at': '', 'detached_at': '', 'volume_id': 'f48842bb-527b-4951-a92b-36a75954ad22', 'serial': 'f48842bb-527b-4951-a92b-36a75954ad22'} {{(pid=67977) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}}
[ 772.260261] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-f1f69c3f-cad1-4930-b4b3-a5b545aff44c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 772.266237] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Waiting for the task: (returnval){
[ 772.266237] env[67977]: value = "task-3468128"
[ 772.266237] env[67977]: _type = "Task"
[ 772.266237] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 772.275623] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468128, 'name': Rename_Task} progress is 5%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 772.314191] env[67977]: DEBUG nova.compute.manager [req-575a19b0-a75b-4630-ab79-9eb46347a710 req-2d6a0c72-e0c7-4982-bafc-12e557445170 service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Received event network-changed-84523293-e6ef-40e3-9c1a-8eefa9c39d13 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 772.314191] env[67977]: DEBUG nova.compute.manager [req-575a19b0-a75b-4630-ab79-9eb46347a710 req-2d6a0c72-e0c7-4982-bafc-12e557445170 service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Refreshing instance network info cache due to event network-changed-84523293-e6ef-40e3-9c1a-8eefa9c39d13. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
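The network-changed event does not patch the cache in place; it triggers a refresh that re-reads the port from Neutron under the same refresh_cache-<uuid> lock used during spawn, so the event handler and the build path cannot interleave cache writes. Schematically (a hypothetical helper, not nova code):

    import threading
    from collections import defaultdict

    _cache_locks = defaultdict(threading.Lock)   # one lock per instance
    _nw_cache = {}

    def refresh_cache(instance_uuid, fetch_nw_info):
        with _cache_locks[instance_uuid]:
            # Re-fetch from the source of truth rather than trusting the event.
            _nw_cache[instance_uuid] = fetch_nw_info(instance_uuid)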
[ 772.314191] env[67977]: DEBUG oslo_concurrency.lockutils [req-575a19b0-a75b-4630-ab79-9eb46347a710 req-2d6a0c72-e0c7-4982-bafc-12e557445170 service nova] Acquiring lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 772.314191] env[67977]: DEBUG oslo_concurrency.lockutils [req-575a19b0-a75b-4630-ab79-9eb46347a710 req-2d6a0c72-e0c7-4982-bafc-12e557445170 service nova] Acquired lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 772.314191] env[67977]: DEBUG nova.network.neutron [req-575a19b0-a75b-4630-ab79-9eb46347a710 req-2d6a0c72-e0c7-4982-bafc-12e557445170 service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Refreshing network info cache for port 84523293-e6ef-40e3-9c1a-8eefa9c39d13 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 772.784385] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468128, 'name': Rename_Task, 'duration_secs': 0.128062} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 772.785197] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Powering on the VM {{(pid=67977) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}}
[ 772.785197] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-26aca218-60f7-46ca-8970-5e548989bac2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 772.791372] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Waiting for the task: (returnval){
[ 772.791372] env[67977]: value = "task-3468132"
[ 772.791372] env[67977]: _type = "Task"
[ 772.791372] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 772.801589] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468132, 'name': PowerOnVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 772.871839] env[67977]: DEBUG nova.network.neutron [req-575a19b0-a75b-4630-ab79-9eb46347a710 req-2d6a0c72-e0c7-4982-bafc-12e557445170 service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Updated VIF entry in instance network info cache for port 84523293-e6ef-40e3-9c1a-8eefa9c39d13. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 772.872146] env[67977]: DEBUG nova.network.neutron [req-575a19b0-a75b-4630-ab79-9eb46347a710 req-2d6a0c72-e0c7-4982-bafc-12e557445170 service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Updating instance_info_cache with network_info: [{"id": "84523293-e6ef-40e3-9c1a-8eefa9c39d13", "address": "fa:16:3e:bf:5b:ef", "network": {"id": "b3179acf-cc67-477c-b095-6ec83690b6ae", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1388201138-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "570bfe0d22ef4aacb477b8cf01505918", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9c1b8689-a9b4-4972-beb9-6a1c8de1dc88", "external-id": "nsx-vlan-transportzone-455", "segmentation_id": 455, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap84523293-e6", "ovs_interfaceid": "84523293-e6ef-40e3-9c1a-8eefa9c39d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 772.886321] env[67977]: DEBUG oslo_concurrency.lockutils [req-575a19b0-a75b-4630-ab79-9eb46347a710 req-2d6a0c72-e0c7-4982-bafc-12e557445170 service nova] Releasing lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 773.302774] env[67977]: DEBUG oslo_vmware.api [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468132, 'name': PowerOnVM_Task, 'duration_secs': 0.463453} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 773.303126] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Powered on the VM {{(pid=67977) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}}
[ 773.303368] env[67977]: INFO nova.compute.manager [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Took 6.27 seconds to spawn the instance on the hypervisor.
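After PowerOnVM_Task succeeds, the manager re-reads the VM's power state (the "Checking state" line that follows); on the VMware driver that is a PropertyCollector read of runtime.powerState. A sketch via oslo_vmware's vim_util, assuming an established session and a VM moref:

    from oslo_vmware import vim_util

    def get_power_state(session, vm_ref):
        # Returns e.g. 'poweredOn' | 'poweredOff' | 'suspended'.
        return session.invoke_api(vim_util, "get_object_property",
                                  session.vim, vm_ref, "runtime.powerState")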
[ 773.303657] env[67977]: DEBUG nova.compute.manager [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Checking state {{(pid=67977) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}}
[ 773.304527] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c93197d-e1f9-4400-a7c6-233f45e09453 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 773.383986] env[67977]: INFO nova.compute.manager [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Took 8.05 seconds to build instance.
[ 773.404197] env[67977]: DEBUG oslo_concurrency.lockutils [None req-81d1fa22-2a63-4ae7-92eb-f405242d0234 tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 36.210s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 773.415267] env[67977]: DEBUG nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 773.476614] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 773.476614] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 773.478412] env[67977]: INFO nova.compute.claims [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 774.155297] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3054db8b-6274-400d-aeed-2548778e7024 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 774.169586] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d34d74e-8b38-4254-a18a-1ecf6bd1dfd7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 774.210686] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48e0e3dc-cb1f-4fbd-b247-b17e0d214d52 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 774.220321] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dc1658e-7995-453e-8a3f-5a16c0d3bbd3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 774.237996] env[67977]: DEBUG nova.compute.provider_tree [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 774.250851] env[67977]: DEBUG nova.scheduler.client.report [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 774.275089] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.798s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 774.276083] env[67977]: DEBUG nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 774.334573] env[67977]: DEBUG nova.compute.utils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 774.336308] env[67977]: DEBUG nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 774.336551] env[67977]: DEBUG nova.network.neutron [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 774.349437] env[67977]: DEBUG nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 774.432661] env[67977]: DEBUG nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 774.460376] env[67977]: DEBUG nova.virt.hardware [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=<?>,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:53:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 774.460611] env[67977]: DEBUG nova.virt.hardware [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 774.460767] env[67977]: DEBUG nova.virt.hardware [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 774.460946] env[67977]: DEBUG nova.virt.hardware [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 774.461244] env[67977]: DEBUG nova.virt.hardware [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 774.461450] env[67977]: DEBUG nova.virt.hardware [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 774.461677] env[67977]: DEBUG nova.virt.hardware [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 774.461839] env[67977]: DEBUG nova.virt.hardware [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 774.462016] env[67977]: DEBUG nova.virt.hardware [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 774.462211] env[67977]: DEBUG nova.virt.hardware [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 774.464018] env[67977]: DEBUG nova.virt.hardware [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 774.464018] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abbcd86e-49e9-4525-936c-933f90d0920a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 774.472158] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b3e8c38-b770-4452-a721-3d9148afbf81 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 774.489796] env[67977]: DEBUG nova.policy [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17a244c71e33402dab25fb5cd08ac951', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1728e8bd17404a509f3a429f329b52d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}}
tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Successfully created port: 335f5509-b65d-44ee-8459-cd631dc6c7a6 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 776.412747] env[67977]: DEBUG nova.network.neutron [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Successfully updated port: 335f5509-b65d-44ee-8459-cd631dc6c7a6 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 776.422166] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7d1919dc-0cd0-45d5-8cf0-86a3ab8d2924 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Acquiring lock "e7543070-519f-470d-b3dd-964b60ce149f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 776.422166] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7d1919dc-0cd0-45d5-8cf0-86a3ab8d2924 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Lock "e7543070-519f-470d-b3dd-964b60ce149f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 776.428880] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquiring lock "refresh_cache-b22ae1a7-c9b8-464b-a81c-73144a0176be" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 776.430019] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquired lock "refresh_cache-b22ae1a7-c9b8-464b-a81c-73144a0176be" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 776.430019] env[67977]: DEBUG nova.network.neutron [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 776.474051] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7d1919dc-0cd0-45d5-8cf0-86a3ab8d2924 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Acquiring lock "de7e2949-00a0-4ce7-9a54-c678d8722464" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 776.474321] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7d1919dc-0cd0-45d5-8cf0-86a3ab8d2924 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Lock "de7e2949-00a0-4ce7-9a54-c678d8722464" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s 
{{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 776.512313] env[67977]: DEBUG nova.network.neutron [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 776.745110] env[67977]: DEBUG nova.network.neutron [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Updating instance_info_cache with network_info: [{"id": "335f5509-b65d-44ee-8459-cd631dc6c7a6", "address": "fa:16:3e:7a:d5:c3", "network": {"id": "1eea5a27-11b1-4a03-9780-4cbf725813c8", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-241272044-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1728e8bd17404a509f3a429f329b52d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b356db78-99c7-4464-822c-fc7e193f7878", "external-id": "nsx-vlan-transportzone-231", "segmentation_id": 231, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap335f5509-b6", "ovs_interfaceid": "335f5509-b65d-44ee-8459-cd631dc6c7a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 776.762950] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Releasing lock "refresh_cache-b22ae1a7-c9b8-464b-a81c-73144a0176be" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 776.762950] env[67977]: DEBUG nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Instance network_info: |[{"id": "335f5509-b65d-44ee-8459-cd631dc6c7a6", "address": "fa:16:3e:7a:d5:c3", "network": {"id": "1eea5a27-11b1-4a03-9780-4cbf725813c8", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-241272044-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1728e8bd17404a509f3a429f329b52d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b356db78-99c7-4464-822c-fc7e193f7878", "external-id": 
"nsx-vlan-transportzone-231", "segmentation_id": 231, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap335f5509-b6", "ovs_interfaceid": "335f5509-b65d-44ee-8459-cd631dc6c7a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 776.763438] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7a:d5:c3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b356db78-99c7-4464-822c-fc7e193f7878', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '335f5509-b65d-44ee-8459-cd631dc6c7a6', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 776.771781] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Creating folder: Project (1728e8bd17404a509f3a429f329b52d4). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 776.776218] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-083f419d-9160-4285-986c-834e4738692b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.789019] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Created folder: Project (1728e8bd17404a509f3a429f329b52d4) in parent group-v693022. [ 776.789019] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Creating folder: Instances. Parent ref: group-v693060. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 776.789019] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-05fa258a-203f-43cc-867e-240137514475 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.796464] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Created folder: Instances in parent group-v693060. [ 776.797457] env[67977]: DEBUG oslo.service.loopingcall [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 776.797457] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 776.797457] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cbd42b15-f6ff-4554-afa7-40e7093154c2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.827126] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 776.827126] env[67977]: value = "task-3468136" [ 776.827126] env[67977]: _type = "Task" [ 776.827126] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 776.837181] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468136, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 777.196584] env[67977]: DEBUG oslo_concurrency.lockutils [None req-23cf627c-ebfc-4435-8cda-1fe313b8c36d tempest-ServerTagsTestJSON-879612222 tempest-ServerTagsTestJSON-879612222-project-member] Acquiring lock "4a59ec41-924b-4eb0-a025-4820479d535b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 777.196834] env[67977]: DEBUG oslo_concurrency.lockutils [None req-23cf627c-ebfc-4435-8cda-1fe313b8c36d tempest-ServerTagsTestJSON-879612222 tempest-ServerTagsTestJSON-879612222-project-member] Lock "4a59ec41-924b-4eb0-a025-4820479d535b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 777.337573] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468136, 'name': CreateVM_Task} progress is 99%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 777.543177] env[67977]: DEBUG nova.compute.manager [req-41acc72b-b246-4141-8f62-c44e68f1b623 req-09167645-6d13-4e3a-b674-2472912f3eda service nova] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Received event network-vif-plugged-335f5509-b65d-44ee-8459-cd631dc6c7a6 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 777.543444] env[67977]: DEBUG oslo_concurrency.lockutils [req-41acc72b-b246-4141-8f62-c44e68f1b623 req-09167645-6d13-4e3a-b674-2472912f3eda service nova] Acquiring lock "b22ae1a7-c9b8-464b-a81c-73144a0176be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 777.543608] env[67977]: DEBUG oslo_concurrency.lockutils [req-41acc72b-b246-4141-8f62-c44e68f1b623 req-09167645-6d13-4e3a-b674-2472912f3eda service nova] Lock "b22ae1a7-c9b8-464b-a81c-73144a0176be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 777.543782] env[67977]: DEBUG oslo_concurrency.lockutils [req-41acc72b-b246-4141-8f62-c44e68f1b623 req-09167645-6d13-4e3a-b674-2472912f3eda service nova] Lock "b22ae1a7-c9b8-464b-a81c-73144a0176be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 777.543943] env[67977]: DEBUG nova.compute.manager [req-41acc72b-b246-4141-8f62-c44e68f1b623 req-09167645-6d13-4e3a-b674-2472912f3eda service nova] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] No waiting events found dispatching network-vif-plugged-335f5509-b65d-44ee-8459-cd631dc6c7a6 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 777.545230] env[67977]: WARNING nova.compute.manager [req-41acc72b-b246-4141-8f62-c44e68f1b623 req-09167645-6d13-4e3a-b674-2472912f3eda service nova] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Received unexpected event network-vif-plugged-335f5509-b65d-44ee-8459-cd631dc6c7a6 for instance with vm_state building and task_state spawning. [ 777.838437] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468136, 'name': CreateVM_Task, 'duration_secs': 0.596378} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 777.838620] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 777.839301] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 777.839479] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 777.839780] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 777.840142] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fb6ab1c0-2ff3-489e-bd6c-a0f56f78e55c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.844661] env[67977]: DEBUG oslo_vmware.api [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Waiting for the task: (returnval){ [ 777.844661] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5272cba7-80c0-560b-5e39-def88a1962bf" [ 777.844661] env[67977]: _type = "Task" [ 777.844661] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 777.852229] env[67977]: DEBUG oslo_vmware.api [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5272cba7-80c0-560b-5e39-def88a1962bf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 778.103206] env[67977]: DEBUG nova.compute.manager [req-76220bd7-975e-4072-bc1d-98a05a738d44 req-2d3d2df3-8ccc-41a3-835f-8e5f0470376a service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Received event network-changed-84523293-e6ef-40e3-9c1a-8eefa9c39d13 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 778.103206] env[67977]: DEBUG nova.compute.manager [req-76220bd7-975e-4072-bc1d-98a05a738d44 req-2d3d2df3-8ccc-41a3-835f-8e5f0470376a service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Refreshing instance network info cache due to event network-changed-84523293-e6ef-40e3-9c1a-8eefa9c39d13. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 778.103206] env[67977]: DEBUG oslo_concurrency.lockutils [req-76220bd7-975e-4072-bc1d-98a05a738d44 req-2d3d2df3-8ccc-41a3-835f-8e5f0470376a service nova] Acquiring lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 778.103206] env[67977]: DEBUG oslo_concurrency.lockutils [req-76220bd7-975e-4072-bc1d-98a05a738d44 req-2d3d2df3-8ccc-41a3-835f-8e5f0470376a service nova] Acquired lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 778.103206] env[67977]: DEBUG nova.network.neutron [req-76220bd7-975e-4072-bc1d-98a05a738d44 req-2d3d2df3-8ccc-41a3-835f-8e5f0470376a service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Refreshing network info cache for port 84523293-e6ef-40e3-9c1a-8eefa9c39d13 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 778.358761] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 778.359394] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 778.359912] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 778.459101] env[67977]: DEBUG nova.network.neutron [req-76220bd7-975e-4072-bc1d-98a05a738d44 req-2d3d2df3-8ccc-41a3-835f-8e5f0470376a service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Updated VIF entry in instance network info cache for port 84523293-e6ef-40e3-9c1a-8eefa9c39d13. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 778.459514] env[67977]: DEBUG nova.network.neutron [req-76220bd7-975e-4072-bc1d-98a05a738d44 req-2d3d2df3-8ccc-41a3-835f-8e5f0470376a service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Updating instance_info_cache with network_info: [{"id": "84523293-e6ef-40e3-9c1a-8eefa9c39d13", "address": "fa:16:3e:bf:5b:ef", "network": {"id": "b3179acf-cc67-477c-b095-6ec83690b6ae", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1388201138-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.150", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "570bfe0d22ef4aacb477b8cf01505918", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9c1b8689-a9b4-4972-beb9-6a1c8de1dc88", "external-id": "nsx-vlan-transportzone-455", "segmentation_id": 455, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap84523293-e6", "ovs_interfaceid": "84523293-e6ef-40e3-9c1a-8eefa9c39d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 778.472936] env[67977]: DEBUG oslo_concurrency.lockutils [req-76220bd7-975e-4072-bc1d-98a05a738d44 req-2d3d2df3-8ccc-41a3-835f-8e5f0470376a service nova] Releasing lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 779.638303] env[67977]: DEBUG nova.compute.manager [req-67d91e5e-9a0e-420c-9c71-927ccb2380fb req-8602b864-d497-464a-80aa-8d82d2d8b6ae service nova] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Received event network-changed-335f5509-b65d-44ee-8459-cd631dc6c7a6 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 779.638609] env[67977]: DEBUG nova.compute.manager [req-67d91e5e-9a0e-420c-9c71-927ccb2380fb req-8602b864-d497-464a-80aa-8d82d2d8b6ae service nova] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Refreshing instance network info cache due to event network-changed-335f5509-b65d-44ee-8459-cd631dc6c7a6. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 779.638677] env[67977]: DEBUG oslo_concurrency.lockutils [req-67d91e5e-9a0e-420c-9c71-927ccb2380fb req-8602b864-d497-464a-80aa-8d82d2d8b6ae service nova] Acquiring lock "refresh_cache-b22ae1a7-c9b8-464b-a81c-73144a0176be" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 779.638844] env[67977]: DEBUG oslo_concurrency.lockutils [req-67d91e5e-9a0e-420c-9c71-927ccb2380fb req-8602b864-d497-464a-80aa-8d82d2d8b6ae service nova] Acquired lock "refresh_cache-b22ae1a7-c9b8-464b-a81c-73144a0176be" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 779.639056] env[67977]: DEBUG nova.network.neutron [req-67d91e5e-9a0e-420c-9c71-927ccb2380fb req-8602b864-d497-464a-80aa-8d82d2d8b6ae service nova] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Refreshing network info cache for port 335f5509-b65d-44ee-8459-cd631dc6c7a6 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 779.956142] env[67977]: DEBUG nova.network.neutron [req-67d91e5e-9a0e-420c-9c71-927ccb2380fb req-8602b864-d497-464a-80aa-8d82d2d8b6ae service nova] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Updated VIF entry in instance network info cache for port 335f5509-b65d-44ee-8459-cd631dc6c7a6. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 779.956507] env[67977]: DEBUG nova.network.neutron [req-67d91e5e-9a0e-420c-9c71-927ccb2380fb req-8602b864-d497-464a-80aa-8d82d2d8b6ae service nova] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Updating instance_info_cache with network_info: [{"id": "335f5509-b65d-44ee-8459-cd631dc6c7a6", "address": "fa:16:3e:7a:d5:c3", "network": {"id": "1eea5a27-11b1-4a03-9780-4cbf725813c8", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-241272044-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1728e8bd17404a509f3a429f329b52d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b356db78-99c7-4464-822c-fc7e193f7878", "external-id": "nsx-vlan-transportzone-231", "segmentation_id": 231, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap335f5509-b6", "ovs_interfaceid": "335f5509-b65d-44ee-8459-cd631dc6c7a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 779.968711] env[67977]: DEBUG oslo_concurrency.lockutils [req-67d91e5e-9a0e-420c-9c71-927ccb2380fb req-8602b864-d497-464a-80aa-8d82d2d8b6ae service nova] Releasing lock "refresh_cache-b22ae1a7-c9b8-464b-a81c-73144a0176be" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 783.393761] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 783.394093] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 783.425929] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 783.775157] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 783.775403] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 783.775594] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 783.775939] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 783.790946] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 783.791223] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.004s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 783.791400] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 783.791588] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 783.793629] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85e519a6-838d-40c8-91f0-060d18d02243 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 783.803123] env[67977]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c208d5fa-a030-4c5d-9e51-6caf5274b090 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 783.819919] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80f083e4-deae-4085-9c40-9b8891869b10 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 783.826711] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff92eb9e-a822-4f79-8c94-283f18745b63 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 783.858987] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180880MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 783.859179] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 783.859426] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 783.965768] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f6e698af-6d7e-40d5-988b-450f300b67a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 783.966017] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 04e59d76-a2d5-482c-90a0-fcb407c0bd4e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 783.966562] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b623a2f1-404e-4f48-aeb2-ebb372260a86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 783.966562] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance faf24c4e-135e-47df-85a6-05024bc9b64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 783.966562] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 02dea9f7-00be-4305-909c-ab9245b60e1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 783.966562] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance fece5f33-93ed-4202-8cd0-637924929ee4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 783.966758] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a2fd776e-9a01-4b67-bc23-1605d6e2b23e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 783.966834] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 783.967019] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 783.967344] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1b5b8be5-7e9c-4269-994a-e54aeb75774f actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 783.967563] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b22ae1a7-c9b8-464b-a81c-73144a0176be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 783.995118] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 83b04c8c-39f6-4f58-b965-0a94c063b68b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.034025] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.047391] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.061539] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b017d568-1ad8-4d8d-84e8-5771341389bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.072934] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ee6c409a-0d32-48fa-a873-b9b62040aef7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.084738] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 7db91c79-1cdb-4101-a369-583b8bbae870 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.110409] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e1027e0e-7938-4772-84c2-f879e9ce4144 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.128097] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3963518a-23de-434e-9f88-392a80daf120 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.140730] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b462ac2e-d668-4ac1-a6b3-2cfb49a2e0cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.175617] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8870c8cf-bf83-482d-91a9-47fdedc79586 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.190650] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance bb8b7561-424e-48ba-9faa-65d6f6465a20 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.205267] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d6893024-9531-435b-8893-38f310224d7b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.238939] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 0d6c2ea5-71ff-49bb-ae23-3323e9b4a3f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.252353] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 72a209af-5976-4943-9752-8c258bb24158 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.262723] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 21a172f7-20d4-4f17-af4d-cadc0fa33c1d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.273533] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d7719b11-cef7-4878-a693-24dcd085a1d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.286101] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e7543070-519f-470d-b3dd-964b60ce149f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.298964] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance de7e2949-00a0-4ce7-9a54-c678d8722464 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.314315] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 4a59ec41-924b-4eb0-a025-4820479d535b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 784.314583] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 11 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 784.314780] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1920MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=11 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 784.806614] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7d01bae-3038-432e-a87c-8c85c816783b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.816043] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5460407c-b43b-4cb4-81c3-698df1c8880c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.847968] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3336337c-5341-4b7d-932f-38ce7181c52e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.855857] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62eea09b-218c-494d-990c-d250ca35fc25 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.869948] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 784.884233] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 784.913963] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 784.914191] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.055s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 785.917574] env[67977]: DEBUG oslo_service.periodic_task [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 785.917857] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 785.917910] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 785.958540] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 785.958742] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 785.958968] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 785.959062] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 785.959155] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 785.960170] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 785.960412] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 785.960626] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 785.960680] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 785.960846] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 786.008455] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 786.008625] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquired lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 786.008780] env[67977]: DEBUG nova.network.neutron [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Forcefully refreshing network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2004}} [ 786.008963] env[67977]: DEBUG nova.objects.instance [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lazy-loading 'info_cache' on Instance uuid 1b5b8be5-7e9c-4269-994a-e54aeb75774f {{(pid=67977) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} [ 786.057103] env[67977]: DEBUG oslo_concurrency.lockutils [None req-59c6d9d3-7fb4-45f9-9210-876f7af9fe8b tempest-ServersTestFqdnHostnames-1425710566 tempest-ServersTestFqdnHostnames-1425710566-project-member] Acquiring lock "2a2e7c1d-af91-48c8-bbbf-3265d7407bb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 786.057343] env[67977]: DEBUG oslo_concurrency.lockutils [None req-59c6d9d3-7fb4-45f9-9210-876f7af9fe8b tempest-ServersTestFqdnHostnames-1425710566 tempest-ServersTestFqdnHostnames-1425710566-project-member] Lock "2a2e7c1d-af91-48c8-bbbf-3265d7407bb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 786.472905] env[67977]: DEBUG nova.network.neutron [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Updating instance_info_cache with network_info: [{"id": "84523293-e6ef-40e3-9c1a-8eefa9c39d13", "address": "fa:16:3e:bf:5b:ef", "network": {"id": "b3179acf-cc67-477c-b095-6ec83690b6ae", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1388201138-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.150", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "570bfe0d22ef4aacb477b8cf01505918", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "9c1b8689-a9b4-4972-beb9-6a1c8de1dc88", "external-id": "nsx-vlan-transportzone-455", "segmentation_id": 455, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap84523293-e6", "ovs_interfaceid": "84523293-e6ef-40e3-9c1a-8eefa9c39d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 786.483905] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Releasing lock "refresh_cache-1b5b8be5-7e9c-4269-994a-e54aeb75774f" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 786.484165] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Updated the network info_cache for instance {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9982}} [ 786.484392] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 786.484834] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 786.484991] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 788.462502] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c3db2a6d-b2be-47da-8d68-cc9b85fd7cb1 tempest-ServerActionsTestJSON-853799719 tempest-ServerActionsTestJSON-853799719-project-member] Acquiring lock "e0c3bec9-6a83-4104-87db-673f90fb1247" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 788.462876] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c3db2a6d-b2be-47da-8d68-cc9b85fd7cb1 tempest-ServerActionsTestJSON-853799719 tempest-ServerActionsTestJSON-853799719-project-member] Lock "e0c3bec9-6a83-4104-87db-673f90fb1247" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 790.734391] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9bc73d1f-a3df-4044-9749-fad3209d280d tempest-ServerActionsV293TestJSON-1366117977 tempest-ServerActionsV293TestJSON-1366117977-project-member] Acquiring lock "05ea43b1-42c7-464b-89c9-b405f7ba20da" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 790.734668] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9bc73d1f-a3df-4044-9749-fad3209d280d tempest-ServerActionsV293TestJSON-1366117977 tempest-ServerActionsV293TestJSON-1366117977-project-member] Lock "05ea43b1-42c7-464b-89c9-b405f7ba20da" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 793.118724] env[67977]: DEBUG oslo_concurrency.lockutils [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Acquiring lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 793.119054] env[67977]: DEBUG oslo_concurrency.lockutils [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 793.119189] env[67977]: DEBUG oslo_concurrency.lockutils [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Acquiring lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 793.119366] env[67977]: DEBUG oslo_concurrency.lockutils [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 
tempest-ServersTestBootFromVolume-1529388701-project-member] Lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 793.120330] env[67977]: DEBUG oslo_concurrency.lockutils [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 793.122068] env[67977]: INFO nova.compute.manager [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Terminating instance [ 793.124195] env[67977]: DEBUG nova.compute.manager [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 793.124774] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Powering off the VM {{(pid=67977) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 793.124861] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-8e47cf25-6e03-40e4-97cf-137ca204b14c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.132259] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Waiting for the task: (returnval){ [ 793.132259] env[67977]: value = "task-3468143" [ 793.132259] env[67977]: _type = "Task" [ 793.132259] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 793.140924] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468143, 'name': PowerOffVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 793.642082] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468143, 'name': PowerOffVM_Task, 'duration_secs': 0.165942} completed successfully. 
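The PowerOffVM_Task exchange above (invoke, poll at 0%, then "completed successfully") is oslo.vmware's standard invoke-and-wait pattern. A hedged sketch of the same flow driven directly; the vCenter host, credentials, and managed-object ID are placeholders, not values from this deployment:

```python
# Invoke-and-poll sketch for a VMware task via oslo.vmware.
# Host, credentials and the moref value are placeholders.
from oslo_vmware import api, vim_util

# Logs in on construction and reuses the session for later calls.
session = api.VMwareAPISession('vcenter.example.test', 'user', 'secret',
                               api_retry_count=10, task_poll_interval=0.5)

vm_ref = vim_util.get_moref('vm-12345', 'VirtualMachine')

# invoke_api returns the task moref; wait_for_task polls it (the
# "progress is 0%" lines) and raises if the task ends in error.
task = session.invoke_api(session.vim, 'PowerOffVM_Task', vm_ref)
task_info = session.wait_for_task(task)
print(task_info.state)  # 'success' on the happy path
```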
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 793.642639] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Powered off the VM {{(pid=67977) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 793.642841] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Volume detach. Driver type: vmdk {{(pid=67977) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 793.643057] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-693037', 'volume_id': 'f48842bb-527b-4951-a92b-36a75954ad22', 'name': 'volume-f48842bb-527b-4951-a92b-36a75954ad22', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1b5b8be5-7e9c-4269-994a-e54aeb75774f', 'attached_at': '', 'detached_at': '', 'volume_id': 'f48842bb-527b-4951-a92b-36a75954ad22', 'serial': 'f48842bb-527b-4951-a92b-36a75954ad22'} {{(pid=67977) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 793.643797] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80d3e3d9-9ba7-4f15-b415-539dfd0199d7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.661387] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf6ee502-99b0-4682-9fc2-5cf973e81ea9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.668042] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfa06040-a5f5-4a99-85c4-aea1a8695277 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.688840] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c174720e-588d-46a1-a011-dc391af6e776 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.705250] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] The volume has not been displaced from its original location: [datastore1] volume-f48842bb-527b-4951-a92b-36a75954ad22/volume-f48842bb-527b-4951-a92b-36a75954ad22.vmdk. No consolidation needed. 
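Strings like "[datastore1] volume-f48842bb-.../volume-f48842bb-....vmdk" in the detach records above are datastore paths, and oslo.vmware ships a small helper for taking them apart. A quick sketch, with attribute names as I read them in oslo_vmware.objects.datastore:

```python
# Parsing the datastore path seen in the volume-detach records above.
from oslo_vmware.objects.datastore import DatastorePath

path = DatastorePath.parse(
    '[datastore1] volume-f48842bb-527b-4951-a92b-36a75954ad22/'
    'volume-f48842bb-527b-4951-a92b-36a75954ad22.vmdk')
print(path.datastore)  # 'datastore1'
print(path.rel_path)   # directory/file part after the datastore name
print(path.basename)   # 'volume-f48842bb-527b-4951-a92b-36a75954ad22.vmdk'
```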
{{(pid=67977) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 793.710434] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Reconfiguring VM instance instance-0000000b to detach disk 2000 {{(pid=67977) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 793.710731] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-fd743301-a918-40fb-9e65-5a88dcae9076 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.728557] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Waiting for the task: (returnval){ [ 793.728557] env[67977]: value = "task-3468144" [ 793.728557] env[67977]: _type = "Task" [ 793.728557] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 793.736393] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468144, 'name': ReconfigVM_Task} progress is 6%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 794.238477] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468144, 'name': ReconfigVM_Task, 'duration_secs': 0.149806} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 794.238768] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Reconfigured VM instance instance-0000000b to detach disk 2000 {{(pid=67977) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 794.243380] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-f95cf8fc-4766-467e-a956-bc44e02ce684 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.258149] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Waiting for the task: (returnval){ [ 794.258149] env[67977]: value = "task-3468145" [ 794.258149] env[67977]: _type = "Task" [ 794.258149] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 794.265937] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468145, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 794.768537] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468145, 'name': ReconfigVM_Task, 'duration_secs': 0.161372} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 794.768815] env[67977]: DEBUG nova.virt.vmwareapi.volumeops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-693037', 'volume_id': 'f48842bb-527b-4951-a92b-36a75954ad22', 'name': 'volume-f48842bb-527b-4951-a92b-36a75954ad22', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1b5b8be5-7e9c-4269-994a-e54aeb75774f', 'attached_at': '', 'detached_at': '', 'volume_id': 'f48842bb-527b-4951-a92b-36a75954ad22', 'serial': 'f48842bb-527b-4951-a92b-36a75954ad22'} {{(pid=67977) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 794.769200] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 794.769914] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0760382d-c50c-4ba3-8c14-e8398059a35e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.776648] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 794.776751] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e9dc0535-bef4-4e16-b603-ef9cf27268e5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.830733] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 794.830965] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 794.831184] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 
tempest-ServersTestBootFromVolume-1529388701-project-member] Deleting the datastore file [datastore1] 1b5b8be5-7e9c-4269-994a-e54aeb75774f {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 794.831450] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0241309e-a1f5-45fa-8933-02e6043f1072 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.837531] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Waiting for the task: (returnval){ [ 794.837531] env[67977]: value = "task-3468147" [ 794.837531] env[67977]: _type = "Task" [ 794.837531] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 794.844700] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468147, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 795.346869] env[67977]: DEBUG oslo_vmware.api [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Task: {'id': task-3468147, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080142} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 795.347196] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 795.347306] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 795.347479] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 795.347652] env[67977]: INFO nova.compute.manager [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Took 2.22 seconds to destroy the instance on the hypervisor. [ 795.347886] env[67977]: DEBUG oslo.service.loopingcall [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
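The "Waiting for function ..._deallocate_network_with_retries to return" record reflects a retry wrapper around the Neutron deallocation call, which the following loopingcall-based sketch imitates; the retry counts, sleep times, and exception type here are illustrative rather than Nova's real settings:

```python
# Retry-on-transient-failure sketch in the shape of
# _deallocate_network_with_retries, using oslo.service's RetryDecorator.
from oslo_service import loopingcall


class TransientNeutronError(Exception):
    pass


@loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                            max_sleep_time=10,
                            exceptions=(TransientNeutronError,))
def deallocate_network(instance_uuid):
    # Nova's version calls network_api.deallocate_for_instance() here
    # and lets the decorator re-run it if a listed exception is raised.
    print('deallocating network for', instance_uuid)


deallocate_network('1b5b8be5-7e9c-4269-994a-e54aeb75774f')
```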
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 795.348086] env[67977]: DEBUG nova.compute.manager [-] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 795.348186] env[67977]: DEBUG nova.network.neutron [-] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 796.303028] env[67977]: DEBUG nova.network.neutron [-] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 796.314888] env[67977]: INFO nova.compute.manager [-] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Took 0.97 seconds to deallocate network for instance. [ 796.396133] env[67977]: INFO nova.compute.manager [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Took 0.08 seconds to detach 1 volumes for instance. [ 796.398541] env[67977]: DEBUG nova.compute.manager [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Deleting volume: f48842bb-527b-4951-a92b-36a75954ad22 {{(pid=67977) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3221}} [ 796.600495] env[67977]: DEBUG nova.compute.manager [req-06860a35-e3b4-4ce9-b01f-919ab872c826 req-a4353b02-be16-4f48-9dbb-6b1bb3cc736d service nova] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Received event network-vif-deleted-84523293-e6ef-40e3-9c1a-8eefa9c39d13 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 796.637877] env[67977]: DEBUG oslo_concurrency.lockutils [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 796.638428] env[67977]: DEBUG oslo_concurrency.lockutils [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 796.638428] env[67977]: DEBUG nova.objects.instance [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Lazy-loading 'resources' on Instance uuid 1b5b8be5-7e9c-4269-994a-e54aeb75774f {{(pid=67977) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} [ 797.140370] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a8901ef-6b76-4d2e-b969-a9bfdb4893b1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.149551] env[67977]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3e34091-a917-46bf-a5e8-fbb1ff971e3e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.186066] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a079dd13-ea52-4b1a-952d-83459adb5ad3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.194190] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fc00bf6-29ad-4d0e-9b41-d871fdb9cbf3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.209369] env[67977]: DEBUG nova.compute.provider_tree [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 797.218889] env[67977]: DEBUG nova.scheduler.client.report [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 797.235059] env[67977]: DEBUG oslo_concurrency.lockutils [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.597s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 797.256556] env[67977]: INFO nova.scheduler.client.report [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Deleted allocations for instance 1b5b8be5-7e9c-4269-994a-e54aeb75774f [ 797.313919] env[67977]: DEBUG oslo_concurrency.lockutils [None req-aa278c12-57b7-4c90-ae44-d74eca30891c tempest-ServersTestBootFromVolume-1529388701 tempest-ServersTestBootFromVolume-1529388701-project-member] Lock "1b5b8be5-7e9c-4269-994a-e54aeb75774f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 4.195s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 805.749668] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7913bbbb-62d1-4049-8a13-2a6ce94d16f8 tempest-ServersV294TestFqdnHostnames-690290236 tempest-ServersV294TestFqdnHostnames-690290236-project-member] Acquiring lock "20642d86-67cd-41ee-ac01-d59fcb5d6243" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 805.749937] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7913bbbb-62d1-4049-8a13-2a6ce94d16f8 tempest-ServersV294TestFqdnHostnames-690290236 tempest-ServersV294TestFqdnHostnames-690290236-project-member] Lock "20642d86-67cd-41ee-ac01-d59fcb5d6243" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 812.825401] env[67977]: WARNING oslo_vmware.rw_handles [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 812.825401] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 812.825401] env[67977]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 812.825401] env[67977]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 812.825401] env[67977]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 812.825401] env[67977]: ERROR oslo_vmware.rw_handles     response.begin()
[ 812.825401] env[67977]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 812.825401] env[67977]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 812.825401] env[67977]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 812.825401] env[67977]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 812.825401] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 812.825401] env[67977]: ERROR oslo_vmware.rw_handles
[ 812.825952] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/c98d0374-dc9e-48a1-b097-59763559e059/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 812.827418] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 812.827666] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Copying Virtual Disk [datastore1] vmware_temp/c98d0374-dc9e-48a1-b097-59763559e059/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/c98d0374-dc9e-48a1-b097-59763559e059/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 812.827938] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-464c4e0b-3f63-4ce6-9d6b-4df73106db12 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 812.837046] env[67977]: DEBUG oslo_vmware.api [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Waiting for the task: (returnval){
[ 812.837046] env[67977]: value = "task-3468149"
[ 812.837046] env[67977]: _type = "Task"
[ 812.837046] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 812.845010] env[67977]: DEBUG oslo_vmware.api [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Task: {'id': task-3468149, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 813.348480] env[67977]: DEBUG oslo_vmware.exceptions [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 813.348806] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 813.349392] env[67977]: ERROR nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 813.349392] env[67977]: Faults: ['InvalidArgument']
[ 813.349392] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Traceback (most recent call last):
[ 813.349392] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 813.349392] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     yield resources
[ 813.349392] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 813.349392] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     self.driver.spawn(context, instance, image_meta,
[ 813.349392] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 813.349392] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 813.349392] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 813.349392] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     self._fetch_image_if_missing(context, vi)
[ 813.349392] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     image_cache(vi, tmp_image_ds_loc)
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     vm_util.copy_virtual_disk(
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     session._wait_for_task(vmdk_copy_task)
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     return self.wait_for_task(task_ref)
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     return evt.wait()
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     result = hub.switch()
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 813.349651] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     return self.greenlet.switch()
[ 813.349908] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 813.349908] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     self.f(*self.args, **self.kw)
[ 813.349908] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 813.349908] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]     raise exceptions.translate_fault(task_info.error)
[ 813.349908] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 813.349908] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Faults: ['InvalidArgument']
[ 813.349908] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4]
[ 813.349908] env[67977]: INFO nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Terminating instance
[ 813.352088] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquiring lock "refresh_cache-fece5f33-93ed-4202-8cd0-637924929ee4" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 813.352201] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquired lock "refresh_cache-fece5f33-93ed-4202-8cd0-637924929ee4" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 813.352391] env[67977]: DEBUG nova.network.neutron [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 813.353964] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 813.353964] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 813.353964] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9497968a-570f-4307-ad8d-405f3c78a4f3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 813.365578] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 813.365578] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Folder [datastore1] devstack-image-cache_base created.
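The spawn failure above surfaces as oslo_vmware.exceptions.VimFaultException raised out of wait_for_task, with the raw vCenter fault names ('InvalidArgument') carried next to the message. A sketch of how calling code can classify such a failure; the session and task objects are placeholders:

```python
# Classifying a VMware task fault like the one in the traceback above.
from oslo_vmware import exceptions as vexc


def wait_and_classify(session, task):
    try:
        return session.wait_for_task(task)
    except vexc.VimFaultException as e:
        # e.fault_list holds the raw fault names, matching the logged
        # "Faults: ['InvalidArgument']".
        if 'InvalidArgument' in e.fault_list:
            raise RuntimeError('bad disk spec (e.g. fileType): %s' % e)
        raise
```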
{{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 813.366301] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eba13b00-5138-4902-a911-c2d2736c6cf1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.372736] env[67977]: DEBUG oslo_vmware.api [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Waiting for the task: (returnval){ [ 813.372736] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52a73008-b69f-6d3a-7f34-6db1574070f3" [ 813.372736] env[67977]: _type = "Task" [ 813.372736] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 813.382963] env[67977]: DEBUG oslo_vmware.api [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52a73008-b69f-6d3a-7f34-6db1574070f3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 813.385832] env[67977]: DEBUG nova.network.neutron [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 813.486354] env[67977]: DEBUG nova.network.neutron [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 813.495157] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Releasing lock "refresh_cache-fece5f33-93ed-4202-8cd0-637924929ee4" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 813.495552] env[67977]: DEBUG nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 813.495744] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 813.497072] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aeedf1a9-100e-4f26-ba8e-e3eeca4e72eb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.505060] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 813.505293] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-097f4ed3-5e1d-452f-801a-a7501db0e553 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.533542] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 813.533690] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 813.533872] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Deleting the datastore file [datastore1] fece5f33-93ed-4202-8cd0-637924929ee4 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 813.534176] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6b2b47a8-d6e3-49a3-9f90-fabbb801ff0c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.541367] env[67977]: DEBUG oslo_vmware.api [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Waiting for the task: (returnval){ [ 813.541367] env[67977]: value = "task-3468151" [ 813.541367] env[67977]: _type = "Task" [ 813.541367] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 813.549019] env[67977]: DEBUG oslo_vmware.api [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Task: {'id': task-3468151, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 813.882220] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 813.882549] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Creating directory with path [datastore1] vmware_temp/99ca9361-c3c8-4a86-af43-798586bdd23e/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 813.882750] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5ee2228c-727a-4191-8087-91cbc9cca747 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.894032] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Created directory with path [datastore1] vmware_temp/99ca9361-c3c8-4a86-af43-798586bdd23e/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 813.894161] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Fetch image to [datastore1] vmware_temp/99ca9361-c3c8-4a86-af43-798586bdd23e/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 813.894334] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/99ca9361-c3c8-4a86-af43-798586bdd23e/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 813.895078] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9180a233-076f-4828-b145-8ee301f864d8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.901750] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69f744c3-c103-4710-8eb7-cfba9b06e435 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.910490] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47723e49-affb-438e-8ea6-60570b7b0950 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.941400] env[67977]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab5b388d-bdba-4889-b7cd-038fd061e9c4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.947135] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-026b877f-31a8-4706-a9f5-204ed5c3a7d6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.979204] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 814.028039] env[67977]: DEBUG oslo_vmware.rw_handles [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/99ca9361-c3c8-4a86-af43-798586bdd23e/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 814.088013] env[67977]: DEBUG oslo_vmware.rw_handles [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 814.088288] env[67977]: DEBUG oslo_vmware.rw_handles [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/99ca9361-c3c8-4a86-af43-798586bdd23e/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 814.092390] env[67977]: DEBUG oslo_vmware.api [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Task: {'id': task-3468151, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.040493} completed successfully. 
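The image-cache records above show the download-once pattern: the request holding the lock on "[datastore1] devstack-image-cache_base/...vmdk" fetches image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 over the HTTP write handle while concurrent builds wait, and everyone then copies from the cached VMDK. A simplified local sketch of that serialization with oslo.concurrency; the paths and the two callables are illustrative:

```python
# Download-once-then-copy sketch for an image cache, serialized with
# the same kind of named lock seen in the log. Paths are illustrative.
import os

from oslo_concurrency import lockutils

CACHE_DIR = 'devstack-image-cache_base'


def fetch_image_if_missing(image_id, download, copy_to_instance):
    lock_name = '[datastore1] %s/%s/%s.vmdk' % (CACHE_DIR, image_id, image_id)
    cached = os.path.join(CACHE_DIR, image_id + '.vmdk')
    with lockutils.lock(lock_name):
        # Only the first caller downloads; later callers find the file.
        if not os.path.exists(cached):
            download(cached)          # the HTTP write-handle step
    copy_to_instance(cached)          # the CopyVirtualDisk_Task step
```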
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 814.092668] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 814.092881] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 814.093079] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 814.093323] env[67977]: INFO nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Took 0.60 seconds to destroy the instance on the hypervisor. [ 814.093496] env[67977]: DEBUG oslo.service.loopingcall [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 814.093700] env[67977]: DEBUG nova.compute.manager [-] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 814.096739] env[67977]: DEBUG nova.compute.claims [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 814.096931] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 814.097176] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 814.552784] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05f7a15a-3672-4e0c-9eec-7c55639b2932 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.560584] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd4fbb88-ae69-4c2e-8659-7ea0c9496198 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.590229] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29f03c07-2328-4bcd-9ecf-d0875f11c95c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.596820] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d46109d-eb40-4960-8cc2-3dd215629f29 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.609915] env[67977]: DEBUG nova.compute.provider_tree [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 814.619113] env[67977]: DEBUG nova.scheduler.client.report [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 814.634432] env[67977]: DEBUG 
oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.537s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 814.634963] env[67977]: ERROR nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 814.634963] env[67977]: Faults: ['InvalidArgument'] [ 814.634963] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Traceback (most recent call last): [ 814.634963] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 814.634963] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] self.driver.spawn(context, instance, image_meta, [ 814.634963] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 814.634963] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 814.634963] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 814.634963] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] self._fetch_image_if_missing(context, vi) [ 814.634963] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 814.634963] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] image_cache(vi, tmp_image_ds_loc) [ 814.634963] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] vm_util.copy_virtual_disk( [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] session._wait_for_task(vmdk_copy_task) [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] return self.wait_for_task(task_ref) [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: 
fece5f33-93ed-4202-8cd0-637924929ee4] return evt.wait() [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] result = hub.switch() [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] return self.greenlet.switch() [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 814.635312] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] self.f(*self.args, **self.kw) [ 814.635623] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 814.635623] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] raise exceptions.translate_fault(task_info.error) [ 814.635623] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 814.635623] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Faults: ['InvalidArgument'] [ 814.635623] env[67977]: ERROR nova.compute.manager [instance: fece5f33-93ed-4202-8cd0-637924929ee4] [ 814.635747] env[67977]: DEBUG nova.compute.utils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 814.637104] env[67977]: DEBUG nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Build of instance fece5f33-93ed-4202-8cd0-637924929ee4 was re-scheduled: A specified parameter was not correct: fileType [ 814.637104] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 814.637478] env[67977]: DEBUG nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 814.637745] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquiring lock "refresh_cache-fece5f33-93ed-4202-8cd0-637924929ee4" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 814.637849] env[67977]: DEBUG oslo_concurrency.lockutils [None 
req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Acquired lock "refresh_cache-fece5f33-93ed-4202-8cd0-637924929ee4" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 814.638011] env[67977]: DEBUG nova.network.neutron [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 814.665416] env[67977]: DEBUG nova.network.neutron [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 814.767689] env[67977]: DEBUG nova.network.neutron [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 814.776962] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Releasing lock "refresh_cache-fece5f33-93ed-4202-8cd0-637924929ee4" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 814.777250] env[67977]: DEBUG nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 814.777435] env[67977]: DEBUG nova.compute.manager [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] [instance: fece5f33-93ed-4202-8cd0-637924929ee4] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 814.875622] env[67977]: INFO nova.scheduler.client.report [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Deleted allocations for instance fece5f33-93ed-4202-8cd0-637924929ee4 [ 814.895902] env[67977]: DEBUG oslo_concurrency.lockutils [None req-70486866-d695-4316-ae9a-67cb7627579d tempest-ServerDiagnosticsV248Test-883628586 tempest-ServerDiagnosticsV248Test-883628586-project-member] Lock "fece5f33-93ed-4202-8cd0-637924929ee4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 97.815s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 814.907572] env[67977]: DEBUG nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 814.958806] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 814.959125] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 814.960761] env[67977]: INFO nova.compute.claims [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 815.348686] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ed1c662-3275-444e-96db-d20410b2e117 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.357832] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8190c324-2b34-4264-a912-6d10fa1aea0c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.387834] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91bb8288-028f-48e7-998f-cb6b45a03ea8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.395132] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76a48937-53cc-4772-ae6e-4dac1c1c3d92 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.410539] env[67977]: DEBUG nova.compute.provider_tree [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b 
tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 815.418044] env[67977]: DEBUG nova.scheduler.client.report [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 815.434416] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.475s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 815.434927] env[67977]: DEBUG nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 815.471169] env[67977]: DEBUG nova.compute.utils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 815.473051] env[67977]: DEBUG nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 815.473051] env[67977]: DEBUG nova.network.neutron [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 815.482047] env[67977]: DEBUG nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Start building block device mappings for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 815.546022] env[67977]: DEBUG nova.policy [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d299fc53ea946b1b92499cd476a957b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6a06792002eb492bb578fc57dd4c0e0f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 815.559856] env[67977]: DEBUG nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 815.587033] env[67977]: DEBUG nova.virt.hardware [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 815.587289] env[67977]: DEBUG nova.virt.hardware [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 815.587448] env[67977]: DEBUG nova.virt.hardware [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 815.587634] env[67977]: DEBUG nova.virt.hardware [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 815.587781] env[67977]: DEBUG nova.virt.hardware [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 815.587996] env[67977]: DEBUG nova.virt.hardware [None 
req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 815.588270] env[67977]: DEBUG nova.virt.hardware [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 815.588395] env[67977]: DEBUG nova.virt.hardware [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 815.588639] env[67977]: DEBUG nova.virt.hardware [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 815.588711] env[67977]: DEBUG nova.virt.hardware [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 815.588912] env[67977]: DEBUG nova.virt.hardware [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 815.589779] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9660127a-b548-4f15-af76-c64154b71456 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.600665] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ac18b75-9029-4237-a774-4238a62167fe {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.023038] env[67977]: DEBUG nova.network.neutron [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Successfully created port: 73c18f38-e9c2-4a2a-a39c-242f923d69ce {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 816.871098] env[67977]: DEBUG nova.network.neutron [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Successfully updated port: 73c18f38-e9c2-4a2a-a39c-242f923d69ce {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 816.880855] env[67977]: DEBUG nova.compute.manager [req-1dd3f3e2-9c1f-453c-8ca4-01ff748672a7 req-ef00f4c5-dc30-483a-af0e-012128a35f2d service nova] [instance: 
83b04c8c-39f6-4f58-b965-0a94c063b68b] Received event network-vif-plugged-73c18f38-e9c2-4a2a-a39c-242f923d69ce {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 816.880855] env[67977]: DEBUG oslo_concurrency.lockutils [req-1dd3f3e2-9c1f-453c-8ca4-01ff748672a7 req-ef00f4c5-dc30-483a-af0e-012128a35f2d service nova] Acquiring lock "83b04c8c-39f6-4f58-b965-0a94c063b68b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 816.880920] env[67977]: DEBUG oslo_concurrency.lockutils [req-1dd3f3e2-9c1f-453c-8ca4-01ff748672a7 req-ef00f4c5-dc30-483a-af0e-012128a35f2d service nova] Lock "83b04c8c-39f6-4f58-b965-0a94c063b68b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 816.881077] env[67977]: DEBUG oslo_concurrency.lockutils [req-1dd3f3e2-9c1f-453c-8ca4-01ff748672a7 req-ef00f4c5-dc30-483a-af0e-012128a35f2d service nova] Lock "83b04c8c-39f6-4f58-b965-0a94c063b68b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 816.881521] env[67977]: DEBUG nova.compute.manager [req-1dd3f3e2-9c1f-453c-8ca4-01ff748672a7 req-ef00f4c5-dc30-483a-af0e-012128a35f2d service nova] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] No waiting events found dispatching network-vif-plugged-73c18f38-e9c2-4a2a-a39c-242f923d69ce {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 816.881521] env[67977]: WARNING nova.compute.manager [req-1dd3f3e2-9c1f-453c-8ca4-01ff748672a7 req-ef00f4c5-dc30-483a-af0e-012128a35f2d service nova] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Received unexpected event network-vif-plugged-73c18f38-e9c2-4a2a-a39c-242f923d69ce for instance with vm_state building and task_state spawning. [ 816.889726] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquiring lock "refresh_cache-83b04c8c-39f6-4f58-b965-0a94c063b68b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 816.889726] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquired lock "refresh_cache-83b04c8c-39f6-4f58-b965-0a94c063b68b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 816.889726] env[67977]: DEBUG nova.network.neutron [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 816.953954] env[67977]: DEBUG nova.network.neutron [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 817.310117] env[67977]: DEBUG nova.network.neutron [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Updating instance_info_cache with network_info: [{"id": "73c18f38-e9c2-4a2a-a39c-242f923d69ce", "address": "fa:16:3e:f7:1f:05", "network": {"id": "942fcb5b-5a79-4f3c-a103-f5a50a2b0f99", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-287880618-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6a06792002eb492bb578fc57dd4c0e0f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60badc2d-69d2-467d-a92e-98511f5cb0b2", "external-id": "cl2-zone-408", "segmentation_id": 408, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73c18f38-e9", "ovs_interfaceid": "73c18f38-e9c2-4a2a-a39c-242f923d69ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 817.328550] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Releasing lock "refresh_cache-83b04c8c-39f6-4f58-b965-0a94c063b68b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 817.328869] env[67977]: DEBUG nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Instance network_info: |[{"id": "73c18f38-e9c2-4a2a-a39c-242f923d69ce", "address": "fa:16:3e:f7:1f:05", "network": {"id": "942fcb5b-5a79-4f3c-a103-f5a50a2b0f99", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-287880618-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6a06792002eb492bb578fc57dd4c0e0f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60badc2d-69d2-467d-a92e-98511f5cb0b2", "external-id": "cl2-zone-408", "segmentation_id": 408, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73c18f38-e9", "ovs_interfaceid": "73c18f38-e9c2-4a2a-a39c-242f923d69ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 817.329872] 
env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f7:1f:05', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '60badc2d-69d2-467d-a92e-98511f5cb0b2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '73c18f38-e9c2-4a2a-a39c-242f923d69ce', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 817.337669] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Creating folder: Project (6a06792002eb492bb578fc57dd4c0e0f). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 817.338269] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-86da952c-7c94-471d-a183-c6db9ad2915d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.349594] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Created folder: Project (6a06792002eb492bb578fc57dd4c0e0f) in parent group-v693022. [ 817.349834] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Creating folder: Instances. Parent ref: group-v693064. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 817.350043] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-decfd70b-dcf3-408a-aeec-7ab1960e505e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.358713] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Created folder: Instances in parent group-v693064. [ 817.358942] env[67977]: DEBUG oslo.service.loopingcall [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 817.359138] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 817.359910] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-53d7621b-916b-4bcd-9ac0-718887e3bde2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.378585] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 817.378585] env[67977]: value = "task-3468154" [ 817.378585] env[67977]: _type = "Task" [ 817.378585] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 817.387323] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468154, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 817.895852] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468154, 'name': CreateVM_Task, 'duration_secs': 0.279121} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 817.896478] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 817.897251] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 817.897492] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 817.897889] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 817.898210] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9e2ed487-a1b8-419d-b76b-087efb7390fd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.902694] env[67977]: DEBUG oslo_vmware.api [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Waiting for the task: (returnval){ [ 817.902694] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52a91552-07e7-bfa8-36d6-0517b6f27e0b" [ 817.902694] env[67977]: _type = "Task" [ 817.902694] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 817.912620] env[67977]: DEBUG oslo_vmware.api [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52a91552-07e7-bfa8-36d6-0517b6f27e0b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 818.413106] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 818.413419] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 818.413644] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 819.258831] env[67977]: DEBUG nova.compute.manager [req-b0cad6fd-e9e3-4e46-b657-97acb58eb56d req-c4108567-8075-45bc-af67-5c4b209be7b5 service nova] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Received event network-changed-73c18f38-e9c2-4a2a-a39c-242f923d69ce {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 819.259084] env[67977]: DEBUG nova.compute.manager [req-b0cad6fd-e9e3-4e46-b657-97acb58eb56d req-c4108567-8075-45bc-af67-5c4b209be7b5 service nova] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Refreshing instance network info cache due to event network-changed-73c18f38-e9c2-4a2a-a39c-242f923d69ce. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 819.259372] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0cad6fd-e9e3-4e46-b657-97acb58eb56d req-c4108567-8075-45bc-af67-5c4b209be7b5 service nova] Acquiring lock "refresh_cache-83b04c8c-39f6-4f58-b965-0a94c063b68b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 819.259587] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0cad6fd-e9e3-4e46-b657-97acb58eb56d req-c4108567-8075-45bc-af67-5c4b209be7b5 service nova] Acquired lock "refresh_cache-83b04c8c-39f6-4f58-b965-0a94c063b68b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 819.259789] env[67977]: DEBUG nova.network.neutron [req-b0cad6fd-e9e3-4e46-b657-97acb58eb56d req-c4108567-8075-45bc-af67-5c4b209be7b5 service nova] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Refreshing network info cache for port 73c18f38-e9c2-4a2a-a39c-242f923d69ce {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 819.684067] env[67977]: DEBUG nova.network.neutron [req-b0cad6fd-e9e3-4e46-b657-97acb58eb56d req-c4108567-8075-45bc-af67-5c4b209be7b5 service nova] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Updated VIF entry in instance network info cache for port 73c18f38-e9c2-4a2a-a39c-242f923d69ce. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 819.684344] env[67977]: DEBUG nova.network.neutron [req-b0cad6fd-e9e3-4e46-b657-97acb58eb56d req-c4108567-8075-45bc-af67-5c4b209be7b5 service nova] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Updating instance_info_cache with network_info: [{"id": "73c18f38-e9c2-4a2a-a39c-242f923d69ce", "address": "fa:16:3e:f7:1f:05", "network": {"id": "942fcb5b-5a79-4f3c-a103-f5a50a2b0f99", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-287880618-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6a06792002eb492bb578fc57dd4c0e0f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60badc2d-69d2-467d-a92e-98511f5cb0b2", "external-id": "cl2-zone-408", "segmentation_id": 408, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73c18f38-e9", "ovs_interfaceid": "73c18f38-e9c2-4a2a-a39c-242f923d69ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 819.694396] env[67977]: DEBUG oslo_concurrency.lockutils [req-b0cad6fd-e9e3-4e46-b657-97acb58eb56d req-c4108567-8075-45bc-af67-5c4b209be7b5 service nova] Releasing lock "refresh_cache-83b04c8c-39f6-4f58-b965-0a94c063b68b" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 821.614407] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquiring lock "6e2f1b5e-7bdc-463d-9822-810f99b81623" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 821.614407] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Lock "6e2f1b5e-7bdc-463d-9822-810f99b81623" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 843.775586] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 843.775866] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 843.776051] env[67977]: DEBUG oslo_service.periodic_task [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 843.787760] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 843.787983] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 843.788171] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 843.788329] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 843.789453] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f076582-9f38-4974-9743-bdc3d47cf007 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 843.798308] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6daabaf-dbae-458f-87ef-cd992719ddc1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 843.814033] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec7bd7fe-0a17-4900-afed-ba3ba4597ec4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 843.820189] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-274d1a1e-62de-424d-806d-fa8491f3feff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 843.849701] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180761MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 843.849888] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 843.850077] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 843.929431] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f6e698af-6d7e-40d5-988b-450f300b67a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 843.929592] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 04e59d76-a2d5-482c-90a0-fcb407c0bd4e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 843.929721] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b623a2f1-404e-4f48-aeb2-ebb372260a86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 843.929844] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance faf24c4e-135e-47df-85a6-05024bc9b64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 843.929964] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 02dea9f7-00be-4305-909c-ab9245b60e1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 843.930096] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a2fd776e-9a01-4b67-bc23-1605d6e2b23e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 843.930217] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 843.930331] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 843.930447] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b22ae1a7-c9b8-464b-a81c-73144a0176be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 843.930559] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 83b04c8c-39f6-4f58-b965-0a94c063b68b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 843.943641] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 843.955096] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 843.965617] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b017d568-1ad8-4d8d-84e8-5771341389bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 843.977017] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ee6c409a-0d32-48fa-a873-b9b62040aef7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 843.987794] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 7db91c79-1cdb-4101-a369-583b8bbae870 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 843.999087] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e1027e0e-7938-4772-84c2-f879e9ce4144 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.011197] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3963518a-23de-434e-9f88-392a80daf120 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.021348] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b462ac2e-d668-4ac1-a6b3-2cfb49a2e0cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.032980] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8870c8cf-bf83-482d-91a9-47fdedc79586 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.045626] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance bb8b7561-424e-48ba-9faa-65d6f6465a20 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.057911] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d6893024-9531-435b-8893-38f310224d7b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.072202] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 0d6c2ea5-71ff-49bb-ae23-3323e9b4a3f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.084426] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 72a209af-5976-4943-9752-8c258bb24158 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.095412] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 21a172f7-20d4-4f17-af4d-cadc0fa33c1d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.106189] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d7719b11-cef7-4878-a693-24dcd085a1d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.117966] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e7543070-519f-470d-b3dd-964b60ce149f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.128559] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance de7e2949-00a0-4ce7-9a54-c678d8722464 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.140549] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 4a59ec41-924b-4eb0-a025-4820479d535b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.154856] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2a2e7c1d-af91-48c8-bbbf-3265d7407bb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.167155] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e0c3bec9-6a83-4104-87db-673f90fb1247 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.180850] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 05ea43b1-42c7-464b-89c9-b405f7ba20da has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.192744] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 20642d86-67cd-41ee-ac01-d59fcb5d6243 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.204486] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 844.204799] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 844.205030] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 844.603850] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7b882cf-7c77-4b96-be2f-ab2e6d7392e2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 844.611474] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c81bb2d-39a9-4c4d-9573-bcb7f261e16f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 844.641154] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ceedc236-4230-4fcd-812a-32e953214cc0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 844.648227] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2cfbfec-da64-48b6-9a0d-fd22d88ccffa {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 844.661292] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 844.670065] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 844.686605] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 844.686799] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.837s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 845.686290] env[67977]: DEBUG oslo_service.periodic_task [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 845.686554] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 845.686701] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 845.686860] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 845.687011] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 846.774878] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 846.775154] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 846.775199] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 846.797836] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 846.798017] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 846.798160] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 846.798294] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 846.798422] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 846.798547] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 846.798675] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 846.798793] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 846.798913] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 846.799043] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 846.799168] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 846.799663] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 859.883583] env[67977]: WARNING oslo_vmware.rw_handles [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 859.883583] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 859.883583] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 859.883583] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 859.883583] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 859.883583] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 859.883583] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 859.883583] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 859.883583] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 859.883583] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 859.883583] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 859.883583] env[67977]: ERROR oslo_vmware.rw_handles [ 859.884118] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/99ca9361-c3c8-4a86-af43-798586bdd23e/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 859.885831] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 859.886130] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Copying Virtual Disk [datastore1] vmware_temp/99ca9361-c3c8-4a86-af43-798586bdd23e/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/99ca9361-c3c8-4a86-af43-798586bdd23e/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 859.886437] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with 
opID=oslo.vmware-f49c5396-f6e8-4407-bd6b-1950171b4883 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 859.894125] env[67977]: DEBUG oslo_vmware.api [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Waiting for the task: (returnval){ [ 859.894125] env[67977]: value = "task-3468155" [ 859.894125] env[67977]: _type = "Task" [ 859.894125] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 859.905604] env[67977]: DEBUG oslo_vmware.api [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Task: {'id': task-3468155, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 860.404446] env[67977]: DEBUG oslo_vmware.exceptions [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 860.404725] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 860.405309] env[67977]: ERROR nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 860.405309] env[67977]: Faults: ['InvalidArgument'] [ 860.405309] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Traceback (most recent call last): [ 860.405309] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 860.405309] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] yield resources [ 860.405309] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 860.405309] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] self.driver.spawn(context, instance, image_meta, [ 860.405309] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 860.405309] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 860.405309] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", 
line 786, in spawn [ 860.405309] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] self._fetch_image_if_missing(context, vi) [ 860.405309] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] image_cache(vi, tmp_image_ds_loc) [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] vm_util.copy_virtual_disk( [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] session._wait_for_task(vmdk_copy_task) [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] return self.wait_for_task(task_ref) [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] return evt.wait() [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] result = hub.switch() [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 860.405669] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] return self.greenlet.switch() [ 860.406115] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 860.406115] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] self.f(*self.args, **self.kw) [ 860.406115] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 860.406115] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] raise exceptions.translate_fault(task_info.error) [ 860.406115] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 860.406115] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Faults: ['InvalidArgument'] [ 860.406115] env[67977]: ERROR nova.compute.manager [instance: 
04e59d76-a2d5-482c-90a0-fcb407c0bd4e] [ 860.406115] env[67977]: INFO nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Terminating instance [ 860.407217] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 860.407418] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 860.408252] env[67977]: DEBUG nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 860.408252] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 860.408466] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05a406e1-8351-4c37-9129-7a1e93c677ac {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.410736] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-390ef8d4-40d1-4034-a6f3-a83b9642f65b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.417353] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 860.417551] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f3490966-4ac7-438f-a1ff-8313420165e3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.419719] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 860.419890] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 
tempest-DeleteServersAdminTestJSON-917492072-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 860.420823] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-29a8fb3e-7cae-47cc-beb8-fa45396a8bc0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.425543] env[67977]: DEBUG oslo_vmware.api [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Waiting for the task: (returnval){ [ 860.425543] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52cb190e-bee5-e3d5-a1b2-23f8d3056649" [ 860.425543] env[67977]: _type = "Task" [ 860.425543] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 860.432696] env[67977]: DEBUG oslo_vmware.api [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52cb190e-bee5-e3d5-a1b2-23f8d3056649, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 860.490970] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 860.491234] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 860.491391] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Deleting the datastore file [datastore1] 04e59d76-a2d5-482c-90a0-fcb407c0bd4e {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 860.491662] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a0a70638-9978-4c53-8bb0-9abd5aa6cbb8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.498342] env[67977]: DEBUG oslo_vmware.api [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Waiting for the task: (returnval){ [ 860.498342] env[67977]: value = "task-3468157" [ 860.498342] env[67977]: _type = "Task" [ 860.498342] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 860.505925] env[67977]: DEBUG oslo_vmware.api [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Task: {'id': task-3468157, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 860.936096] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 860.936399] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Creating directory with path [datastore1] vmware_temp/2f91c858-166b-4834-b51d-1594c3c3f0b8/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 860.936606] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-760f3336-ca65-4291-8d41-582c68d8f45c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.952641] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Created directory with path [datastore1] vmware_temp/2f91c858-166b-4834-b51d-1594c3c3f0b8/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 860.952846] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Fetch image to [datastore1] vmware_temp/2f91c858-166b-4834-b51d-1594c3c3f0b8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 860.953031] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/2f91c858-166b-4834-b51d-1594c3c3f0b8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 860.954118] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f568df5-1efd-4d6e-9d93-bac86e2730b7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.960373] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5863541a-0d77-4ebe-a110-420926da026b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.969942] 
env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d674f2ef-1bcd-475c-9cb5-ebf113f65082 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.003407] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-189d59bb-6da6-4052-82ba-328250b4e615 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.010534] env[67977]: DEBUG oslo_vmware.api [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Task: {'id': task-3468157, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075316} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 861.012090] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 861.012742] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 861.012742] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 861.012742] env[67977]: INFO nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Took 0.60 seconds to destroy the instance on the hypervisor. 
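The exchange above — CopyVirtualDisk_Task invoked, task-3468155 polled at 0%, then the fault translated into VimFaultException ("A specified parameter was not correct: fileType") — follows oslo_vmware's generic task-polling pattern. A minimal sketch of that loop, assuming hypothetical helpers (get_task_info(), TaskFault) rather than quoting the real oslo_vmware internals:

    import time

    class TaskFault(Exception):
        """Stand-in for oslo_vmware's translated task fault (hypothetical)."""

    def wait_for_task(session, task_ref, interval=0.5):
        # Poll the vCenter task state until it succeeds or errors out,
        # mirroring the "Task: {'id': ..., progress is 0%" lines above.
        while True:
            info = session.get_task_info(task_ref)  # hypothetical helper
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # In the log, the 'InvalidArgument' fault on fileType is
                # raised at this point and aborts the image cache copy.
                raise TaskFault(info.error)
            time.sleep(interval)

On the error branch nova aborts the spawn, which is why the UnregisterVM / DeleteDatastoreFile_Task cleanup sequence for instance 04e59d76-a2d5-482c-90a0-fcb407c0bd4e follows immediately in the records above.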
[ 861.014408] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a57dfad7-a82d-4f20-9a47-29439b1ca133 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.016375] env[67977]: DEBUG nova.compute.claims [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 861.016552] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 861.017267] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 861.039439] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 861.089941] env[67977]: DEBUG oslo_vmware.rw_handles [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2f91c858-166b-4834-b51d-1594c3c3f0b8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 861.150508] env[67977]: DEBUG oslo_vmware.rw_handles [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 861.150701] env[67977]: DEBUG oslo_vmware.rw_handles [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2f91c858-166b-4834-b51d-1594c3c3f0b8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 861.482483] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6b219aa-def5-4990-a701-58ed7e619e14 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.489796] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf0349a5-bbc8-4bab-af83-28f35b0e3650 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.519181] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73ead985-c293-40ab-b3b3-9ecac7288da6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.526391] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4505e183-f023-440c-94dd-45edbb57e68f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.539218] env[67977]: DEBUG nova.compute.provider_tree [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 861.547244] env[67977]: DEBUG nova.scheduler.client.report [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 861.562338] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.545s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.562887] env[67977]: ERROR nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 861.562887] env[67977]: Faults: ['InvalidArgument'] [ 861.562887] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Traceback (most recent call last): [ 861.562887] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 861.562887] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] self.driver.spawn(context, instance, image_meta, [ 861.562887] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 861.562887] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 861.562887] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 861.562887] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] self._fetch_image_if_missing(context, vi) [ 861.562887] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 861.562887] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] image_cache(vi, tmp_image_ds_loc) [ 861.562887] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] vm_util.copy_virtual_disk( [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] session._wait_for_task(vmdk_copy_task) [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] return self.wait_for_task(task_ref) [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] return evt.wait() [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] result = hub.switch() [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] return self.greenlet.switch() [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 861.563261] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] self.f(*self.args, **self.kw) [ 861.563622] env[67977]: ERROR nova.compute.manager [instance: 
04e59d76-a2d5-482c-90a0-fcb407c0bd4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 861.563622] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] raise exceptions.translate_fault(task_info.error) [ 861.563622] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 861.563622] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Faults: ['InvalidArgument'] [ 861.563622] env[67977]: ERROR nova.compute.manager [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] [ 861.563763] env[67977]: DEBUG nova.compute.utils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 861.565098] env[67977]: DEBUG nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Build of instance 04e59d76-a2d5-482c-90a0-fcb407c0bd4e was re-scheduled: A specified parameter was not correct: fileType [ 861.565098] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 861.565557] env[67977]: DEBUG nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 861.565734] env[67977]: DEBUG nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 861.565895] env[67977]: DEBUG nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 861.566073] env[67977]: DEBUG nova.network.neutron [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 862.002224] env[67977]: DEBUG nova.network.neutron [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 862.015172] env[67977]: INFO nova.compute.manager [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] [instance: 04e59d76-a2d5-482c-90a0-fcb407c0bd4e] Took 0.45 seconds to deallocate network for instance. [ 862.110537] env[67977]: INFO nova.scheduler.client.report [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Deleted allocations for instance 04e59d76-a2d5-482c-90a0-fcb407c0bd4e [ 862.137855] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9d57661d-854e-4e2a-8aad-7e58fb16bad9 tempest-ServerDiagnosticsNegativeTest-1360345589 tempest-ServerDiagnosticsNegativeTest-1360345589-project-member] Lock "04e59d76-a2d5-482c-90a0-fcb407c0bd4e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 149.561s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 862.160245] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Starting instance... 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 862.219283] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 862.219535] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 862.221748] env[67977]: INFO nova.compute.claims [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 862.629461] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fba1902-da23-4d00-8ec5-ce4fa62eed46 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.637646] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26fd3d66-f19a-4f5b-bd0b-557b5ef8d420 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.667721] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eaf8651c-f841-4080-8b58-6aac0b2895b4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.674946] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3e9e71e-957d-4556-9486-3d676442e4c2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.688567] env[67977]: DEBUG nova.compute.provider_tree [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 862.700512] env[67977]: DEBUG nova.scheduler.client.report [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 862.720013] env[67977]: DEBUG 
oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.500s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 862.720512] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 862.759465] env[67977]: DEBUG nova.compute.utils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 862.761557] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 862.761747] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 862.770885] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 862.833219] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 862.857259] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=<?>,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:53:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 862.857507] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 862.857666] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 862.857849] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 862.857996] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 862.858279] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 862.858497] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 862.858659] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies
/opt/stack/nova/nova/virt/hardware.py:471}} [ 862.858832] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 862.858996] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 862.859186] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 862.860042] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c44dd95f-7f45-4354-82f8-c5b57a3fc5a6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.863800] env[67977]: DEBUG nova.policy [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '47fe0af6e6a14b02860c64ad47acec73', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2bc90d7e60864418ab61c128ae20c558', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 862.870594] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efe670c1-1a03-4645-b2ab-46fcd426824f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.277686] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Successfully created port: 579a5c78-5531-4ed3-b711-bd335b6a4923 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 864.256673] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Successfully updated port: 579a5c78-5531-4ed3-b711-bd335b6a4923 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 864.280025] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "refresh_cache-e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" {{(pid=67977) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 864.280273] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquired lock "refresh_cache-e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 864.280324] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 864.294399] env[67977]: DEBUG nova.compute.manager [req-98d2b5a1-c0a8-4443-a9e1-9e37348d0dc3 req-abab46d1-5364-421b-84f6-69c549a84f96 service nova] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Received event network-vif-plugged-579a5c78-5531-4ed3-b711-bd335b6a4923 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 864.296181] env[67977]: DEBUG oslo_concurrency.lockutils [req-98d2b5a1-c0a8-4443-a9e1-9e37348d0dc3 req-abab46d1-5364-421b-84f6-69c549a84f96 service nova] Acquiring lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 864.296181] env[67977]: DEBUG oslo_concurrency.lockutils [req-98d2b5a1-c0a8-4443-a9e1-9e37348d0dc3 req-abab46d1-5364-421b-84f6-69c549a84f96 service nova] Lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 864.296181] env[67977]: DEBUG oslo_concurrency.lockutils [req-98d2b5a1-c0a8-4443-a9e1-9e37348d0dc3 req-abab46d1-5364-421b-84f6-69c549a84f96 service nova] Lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 864.296181] env[67977]: DEBUG nova.compute.manager [req-98d2b5a1-c0a8-4443-a9e1-9e37348d0dc3 req-abab46d1-5364-421b-84f6-69c549a84f96 service nova] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] No waiting events found dispatching network-vif-plugged-579a5c78-5531-4ed3-b711-bd335b6a4923 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 864.296485] env[67977]: WARNING nova.compute.manager [req-98d2b5a1-c0a8-4443-a9e1-9e37348d0dc3 req-abab46d1-5364-421b-84f6-69c549a84f96 service nova] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Received unexpected event network-vif-plugged-579a5c78-5531-4ed3-b711-bd335b6a4923 for instance with vm_state building and task_state spawning. [ 864.328276] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 864.535105] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Updating instance_info_cache with network_info: [{"id": "579a5c78-5531-4ed3-b711-bd335b6a4923", "address": "fa:16:3e:e7:a8:18", "network": {"id": "4670e76b-e766-4136-a26c-e801979a2dfa", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-2063443269-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bc90d7e60864418ab61c128ae20c558", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "796c7fcb-00fd-4692-a44b-7ec550201e86", "external-id": "nsx-vlan-transportzone-42", "segmentation_id": 42, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap579a5c78-55", "ovs_interfaceid": "579a5c78-5531-4ed3-b711-bd335b6a4923", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 864.546858] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Releasing lock "refresh_cache-e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 864.547193] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Instance network_info: |[{"id": "579a5c78-5531-4ed3-b711-bd335b6a4923", "address": "fa:16:3e:e7:a8:18", "network": {"id": "4670e76b-e766-4136-a26c-e801979a2dfa", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-2063443269-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bc90d7e60864418ab61c128ae20c558", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "796c7fcb-00fd-4692-a44b-7ec550201e86", "external-id": "nsx-vlan-transportzone-42", "segmentation_id": 42, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap579a5c78-55", "ovs_interfaceid": "579a5c78-5531-4ed3-b711-bd335b6a4923", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 864.547588] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e7:a8:18', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '796c7fcb-00fd-4692-a44b-7ec550201e86', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '579a5c78-5531-4ed3-b711-bd335b6a4923', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 864.558864] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Creating folder: Project (2bc90d7e60864418ab61c128ae20c558). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 864.560030] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-be05f9ab-51a4-4aff-9500-0d57ff89f84d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.571814] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Created folder: Project (2bc90d7e60864418ab61c128ae20c558) in parent group-v693022. [ 864.572031] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Creating folder: Instances. Parent ref: group-v693067. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 864.572281] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fe534b3a-2a46-4250-869f-61aa4c1647bb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.581987] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Created folder: Instances in parent group-v693067. [ 864.582440] env[67977]: DEBUG oslo.service.loopingcall [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 864.582631] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 864.582861] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a05178c0-0a4f-4b8c-9ac2-ad1ddedac98e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.605435] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 864.605435] env[67977]: value = "task-3468160" [ 864.605435] env[67977]: _type = "Task" [ 864.605435] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 864.614650] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468160, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 865.116626] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468160, 'name': CreateVM_Task, 'duration_secs': 0.403745} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 865.116839] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 865.117522] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 865.117828] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 865.118022] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 865.118265] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-090e8d27-a040-4f8f-8c25-10216cd376bd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.123137] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for the task: (returnval){ [ 865.123137] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52e49e16-4ed2-afc2-1086-9ca336df75b7" [ 865.123137] env[67977]: _type = "Task" [ 865.123137] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 865.130667] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52e49e16-4ed2-afc2-1086-9ca336df75b7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 865.636407] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 865.636407] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 865.636407] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 866.447727] env[67977]: DEBUG nova.compute.manager [req-04da557f-890d-4954-a75a-fb0eaaa2d7cb req-dc15a71b-c183-412d-92d9-6dd997b0c7e1 service nova] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Received event network-changed-579a5c78-5531-4ed3-b711-bd335b6a4923 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 866.447961] env[67977]: DEBUG nova.compute.manager [req-04da557f-890d-4954-a75a-fb0eaaa2d7cb req-dc15a71b-c183-412d-92d9-6dd997b0c7e1 service nova] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Refreshing instance network info cache due to event network-changed-579a5c78-5531-4ed3-b711-bd335b6a4923. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 866.448294] env[67977]: DEBUG oslo_concurrency.lockutils [req-04da557f-890d-4954-a75a-fb0eaaa2d7cb req-dc15a71b-c183-412d-92d9-6dd997b0c7e1 service nova] Acquiring lock "refresh_cache-e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 866.448379] env[67977]: DEBUG oslo_concurrency.lockutils [req-04da557f-890d-4954-a75a-fb0eaaa2d7cb req-dc15a71b-c183-412d-92d9-6dd997b0c7e1 service nova] Acquired lock "refresh_cache-e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 866.448573] env[67977]: DEBUG nova.network.neutron [req-04da557f-890d-4954-a75a-fb0eaaa2d7cb req-dc15a71b-c183-412d-92d9-6dd997b0c7e1 service nova] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Refreshing network info cache for port 579a5c78-5531-4ed3-b711-bd335b6a4923 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 867.188169] env[67977]: DEBUG nova.network.neutron [req-04da557f-890d-4954-a75a-fb0eaaa2d7cb req-dc15a71b-c183-412d-92d9-6dd997b0c7e1 service nova] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Updated VIF entry in instance network info cache for port 579a5c78-5531-4ed3-b711-bd335b6a4923. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 867.188764] env[67977]: DEBUG nova.network.neutron [req-04da557f-890d-4954-a75a-fb0eaaa2d7cb req-dc15a71b-c183-412d-92d9-6dd997b0c7e1 service nova] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Updating instance_info_cache with network_info: [{"id": "579a5c78-5531-4ed3-b711-bd335b6a4923", "address": "fa:16:3e:e7:a8:18", "network": {"id": "4670e76b-e766-4136-a26c-e801979a2dfa", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-2063443269-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bc90d7e60864418ab61c128ae20c558", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "796c7fcb-00fd-4692-a44b-7ec550201e86", "external-id": "nsx-vlan-transportzone-42", "segmentation_id": 42, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap579a5c78-55", "ovs_interfaceid": "579a5c78-5531-4ed3-b711-bd335b6a4923", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 867.201439] env[67977]: DEBUG oslo_concurrency.lockutils [req-04da557f-890d-4954-a75a-fb0eaaa2d7cb req-dc15a71b-c183-412d-92d9-6dd997b0c7e1 service nova] Releasing lock "refresh_cache-e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 869.178533] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 
tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquiring lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 869.178821] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 903.795392] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 904.774677] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 905.770355] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 905.775133] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 905.775353] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 905.775522] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 905.775667] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 905.775854] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 905.794046] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 905.794390] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 905.794577] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 905.794773] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 905.795947] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e589fe3-b562-43b5-b242-6f27f2b37496 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.804772] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21390d51-74bb-40bb-93f6-0f7cdf3e93c6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.829860] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e817c1ea-06f8-4cc0-a584-8db8fcb293c9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.837321] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5b83e8c-ccb6-4cb9-9f86-22f1b3c56ea8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.871269] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180929MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 905.871433] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 905.871629] 
env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 905.952395] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f6e698af-6d7e-40d5-988b-450f300b67a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 905.952395] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b623a2f1-404e-4f48-aeb2-ebb372260a86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 905.952395] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance faf24c4e-135e-47df-85a6-05024bc9b64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 905.952395] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 02dea9f7-00be-4305-909c-ab9245b60e1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 905.952596] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a2fd776e-9a01-4b67-bc23-1605d6e2b23e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 905.952596] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 905.952596] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 905.952852] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b22ae1a7-c9b8-464b-a81c-73144a0176be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 905.952956] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 83b04c8c-39f6-4f58-b965-0a94c063b68b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 905.953114] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 905.965442] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 905.979282] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b017d568-1ad8-4d8d-84e8-5771341389bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 905.989957] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ee6c409a-0d32-48fa-a873-b9b62040aef7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 905.999621] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 7db91c79-1cdb-4101-a369-583b8bbae870 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.010022] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e1027e0e-7938-4772-84c2-f879e9ce4144 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.019879] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3963518a-23de-434e-9f88-392a80daf120 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.029321] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b462ac2e-d668-4ac1-a6b3-2cfb49a2e0cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.038592] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8870c8cf-bf83-482d-91a9-47fdedc79586 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.048990] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance bb8b7561-424e-48ba-9faa-65d6f6465a20 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.058323] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d6893024-9531-435b-8893-38f310224d7b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.067963] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 0d6c2ea5-71ff-49bb-ae23-3323e9b4a3f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.077795] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 72a209af-5976-4943-9752-8c258bb24158 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.089485] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 21a172f7-20d4-4f17-af4d-cadc0fa33c1d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.098891] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d7719b11-cef7-4878-a693-24dcd085a1d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.108325] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e7543070-519f-470d-b3dd-964b60ce149f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.118146] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance de7e2949-00a0-4ce7-9a54-c678d8722464 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.128143] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 4a59ec41-924b-4eb0-a025-4820479d535b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.137618] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2a2e7c1d-af91-48c8-bbbf-3265d7407bb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.149423] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e0c3bec9-6a83-4104-87db-673f90fb1247 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.158920] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 05ea43b1-42c7-464b-89c9-b405f7ba20da has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.168298] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 20642d86-67cd-41ee-ac01-d59fcb5d6243 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.177817] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 906.188061] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 906.189126] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 906.189126] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 906.584048] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-027479df-d79a-4856-9db5-679b16e57cdc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 906.591601] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aa2eaa0-4923-4d0c-9afa-c38cfc4d7422 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 906.620909] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fd070b6-13e2-468d-92ab-1dc34266ad64 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 906.628041] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26c03e22-e0b2-4b55-81c0-9638c9207558 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 906.641288] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 906.650613] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 906.663767] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 906.663954] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.792s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 907.663581] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 907.775130] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 907.862538] env[67977]: WARNING oslo_vmware.rw_handles [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 907.862538] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 907.862538] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 907.862538] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 907.862538] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 907.862538] env[67977]: ERROR oslo_vmware.rw_handles response.begin()
[ 907.862538] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 907.862538] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 907.862538] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 907.862538] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 907.862538] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 907.862538] env[67977]: ERROR oslo_vmware.rw_handles
[ 907.862959] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/2f91c858-166b-4834-b51d-1594c3c3f0b8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 907.864589] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 907.864875] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Copying Virtual Disk [datastore1] vmware_temp/2f91c858-166b-4834-b51d-1594c3c3f0b8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/2f91c858-166b-4834-b51d-1594c3c3f0b8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 907.865181] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4de65dbe-a391-42bb-b20d-43dc93f4d34a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 907.873346] env[67977]: DEBUG oslo_vmware.api [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Waiting for the task: (returnval){
[ 907.873346] env[67977]: value = "task-3468161"
[ 907.873346] env[67977]: _type = "Task"
[ 907.873346] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 907.882432] env[67977]: DEBUG oslo_vmware.api [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Task: {'id': task-3468161, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 908.384749] env[67977]: DEBUG oslo_vmware.exceptions [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 908.385043] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 908.385632] env[67977]: ERROR nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 908.385632] env[67977]: Faults: ['InvalidArgument']
[ 908.385632] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Traceback (most recent call last):
[ 908.385632] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 908.385632] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] yield resources
[ 908.385632] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 908.385632] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] self.driver.spawn(context, instance, image_meta,
[ 908.385632] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 908.385632] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 908.385632] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 908.385632] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] self._fetch_image_if_missing(context, vi)
[ 908.385632] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] image_cache(vi, tmp_image_ds_loc)
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] vm_util.copy_virtual_disk(
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] session._wait_for_task(vmdk_copy_task)
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] return self.wait_for_task(task_ref)
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] return evt.wait()
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] result = hub.switch()
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 908.386030] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] return self.greenlet.switch()
[ 908.386411] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 908.386411] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] self.f(*self.args, **self.kw)
[ 908.386411] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 908.386411] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] raise exceptions.translate_fault(task_info.error)
[ 908.386411] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 908.386411] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Faults: ['InvalidArgument']
[ 908.386411] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86]
[ 908.386411] env[67977]: INFO nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Terminating instance
[ 908.387925] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 908.387925] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 908.387925] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-164d3809-2d38-4d44-843c-6091dcf49e9e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 908.390177] env[67977]: DEBUG nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 908.390371] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 908.391098] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53a08673-f3d8-43c9-9dc1-2b74dc36fafe {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 908.398261] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 908.398462] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a2146650-ccff-4751-abdc-3f7e9e9a5ca0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 908.400675] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 908.400848] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 908.401813] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0967e7ab-10db-4f40-8e72-22aef89ca2ab {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 908.406241] env[67977]: DEBUG oslo_vmware.api [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Waiting for the task: (returnval){
[ 908.406241] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523a3f5a-9c5c-8785-8a06-4bc829775df2"
[ 908.406241] env[67977]: _type = "Task"
[ 908.406241] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 908.413338] env[67977]: DEBUG oslo_vmware.api [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523a3f5a-9c5c-8785-8a06-4bc829775df2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 908.463452] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 908.463671] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 908.463848] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Deleting the datastore file [datastore1] b623a2f1-404e-4f48-aeb2-ebb372260a86 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 908.464115] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3c40caa1-1881-4888-a46b-cf0435efd7b6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 908.470599] env[67977]: DEBUG oslo_vmware.api [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Waiting for the task: (returnval){
[ 908.470599] env[67977]: value = "task-3468163"
[ 908.470599] env[67977]: _type = "Task"
[ 908.470599] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 908.478090] env[67977]: DEBUG oslo_vmware.api [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Task: {'id': task-3468163, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 908.775948] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 908.775948] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 908.775948] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 908.797018] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 908.797018] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 908.797018] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 908.797018] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 908.797018] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 908.797239] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 908.797239] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 908.797239] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 908.797239] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 908.797338] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 908.797457] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 908.916494] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 908.916757] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Creating directory with path [datastore1] vmware_temp/3cace94f-d4a9-4bec-a1ac-7f9135767ecc/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 908.916995] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2137445c-5838-4edd-a9ce-7556fc1f2e8d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 908.962307] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Created directory with path [datastore1] vmware_temp/3cace94f-d4a9-4bec-a1ac-7f9135767ecc/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 908.962514] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Fetch image to [datastore1] vmware_temp/3cace94f-d4a9-4bec-a1ac-7f9135767ecc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 908.962740] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/3cace94f-d4a9-4bec-a1ac-7f9135767ecc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 908.963516] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8968519-1d9a-4468-9da1-b04afab79f40 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 908.970555] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52a3793d-bb79-4b12-873f-0cd58b6f9414 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 908.984992] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-070d6503-c3dc-404f-b04a-ede8b51d9af0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 908.988763] env[67977]: DEBUG oslo_vmware.api [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Task: {'id': task-3468163, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074977} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 908.989011] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 908.989201] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 908.989371] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 908.989542] env[67977]: INFO nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 908.991948] env[67977]: DEBUG nova.compute.claims [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 908.992136] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 908.992350] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 909.020977] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa769cc7-428a-4558-bd9f-3894bb3c14a5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 909.026872] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0f7f40ed-e746-4515-9497-365b865234fa {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 909.050319] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 909.106231] env[67977]: DEBUG oslo_vmware.rw_handles [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3cace94f-d4a9-4bec-a1ac-7f9135767ecc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 909.164801] env[67977]: DEBUG oslo_vmware.rw_handles [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 909.165030] env[67977]: DEBUG oslo_vmware.rw_handles [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3cace94f-d4a9-4bec-a1ac-7f9135767ecc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 909.435017] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f517c3a-97f1-4208-a2e4-b2ad343454b1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 909.443112] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3fbe3b7-94f9-4b23-8bd3-a72c06f9cc49 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 909.472084] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-294d977f-2b9a-42ee-9a60-08c83e57c1d1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 909.479175] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f06a584b-f649-442c-8ec3-eeb999a9e825 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 909.491967] env[67977]: DEBUG nova.compute.provider_tree [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 909.501156] env[67977]: DEBUG nova.scheduler.client.report [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 909.513789] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.521s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 909.514368] env[67977]: ERROR nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 909.514368] env[67977]: Faults: ['InvalidArgument']
[ 909.514368] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Traceback (most recent call last):
[ 909.514368] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 909.514368] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] self.driver.spawn(context, instance, image_meta,
[ 909.514368] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 909.514368] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 909.514368] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 909.514368] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] self._fetch_image_if_missing(context, vi)
[ 909.514368] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 909.514368] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] image_cache(vi, tmp_image_ds_loc)
[ 909.514368] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] vm_util.copy_virtual_disk(
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] session._wait_for_task(vmdk_copy_task)
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] return self.wait_for_task(task_ref)
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] return evt.wait()
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] result = hub.switch()
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] return self.greenlet.switch()
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 909.514687] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] self.f(*self.args, **self.kw)
[ 909.515236] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 909.515236] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] raise exceptions.translate_fault(task_info.error)
[ 909.515236] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 909.515236] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Faults: ['InvalidArgument']
[ 909.515236] env[67977]: ERROR nova.compute.manager [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86]
[ 909.515236] env[67977]: DEBUG nova.compute.utils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 909.516450] env[67977]: DEBUG nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Build of instance b623a2f1-404e-4f48-aeb2-ebb372260a86 was re-scheduled: A specified parameter was not correct: fileType
[ 909.516450] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 909.516824] env[67977]: DEBUG nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 909.516997] env[67977]: DEBUG nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 909.517183] env[67977]: DEBUG nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 909.517373] env[67977]: DEBUG nova.network.neutron [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 909.924023] env[67977]: DEBUG nova.network.neutron [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 909.936333] env[67977]: INFO nova.compute.manager [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b623a2f1-404e-4f48-aeb2-ebb372260a86] Took 0.42 seconds to deallocate network for instance.
[ 910.038945] env[67977]: INFO nova.scheduler.client.report [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Deleted allocations for instance b623a2f1-404e-4f48-aeb2-ebb372260a86
[ 910.066843] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fb43a34a-73e3-4e93-9bf1-bae3e12c430e tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "b623a2f1-404e-4f48-aeb2-ebb372260a86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 196.756s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 910.098173] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 910.155660] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 910.155924] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 910.157827] env[67977]: INFO nova.compute.claims [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 910.576200] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7479b30-3577-49ae-bc74-aaf1e7febcbb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 910.584025] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbc53bad-0d6f-4209-99b3-6ab2c23f1f16 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 910.613729] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9047cb51-36d8-47ed-86a6-de0c75de02e8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 910.620769] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35b7c5c7-9088-44c0-9234-2ed7d413ffb3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 910.633648] env[67977]: DEBUG nova.compute.provider_tree [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 910.643362] env[67977]: DEBUG nova.scheduler.client.report [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 910.656995] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.501s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 910.657486] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 910.690796] env[67977]: DEBUG nova.compute.utils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 910.691983] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 910.692256] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 910.700439] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 910.764687] env[67977]: DEBUG nova.policy [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '47fe0af6e6a14b02860c64ad47acec73', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2bc90d7e60864418ab61c128ae20c558', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}}
[ 910.773980] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 910.800427] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 910.800712] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 910.800882] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 910.801080] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 910.801233] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 910.801439] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 910.801614] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 910.801778] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 910.802128] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 910.802210] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 910.802374] env[67977]: DEBUG nova.virt.hardware [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 910.803246] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-152916be-983a-4bd4-93d2-63c22a053974 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 910.811678] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2827790f-0629-4fc9-aa82-1dc3b94217cb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 911.128361] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Successfully created port: 850e0bff-2f0b-42ec-807c-c397d8383288 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 911.920082] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "b8faf6c6-2531-44b9-8382-ddbc0feddf24" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 911.920328] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "b8faf6c6-2531-44b9-8382-ddbc0feddf24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 911.956145] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Successfully updated port: 850e0bff-2f0b-42ec-807c-c397d8383288 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 911.970759] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "refresh_cache-48d09ae0-ab95-45e8-a916-ecf24abb66a0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 911.970920] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquired lock "refresh_cache-48d09ae0-ab95-45e8-a916-ecf24abb66a0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 911.971090] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 912.036403] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 912.185635] env[67977]: DEBUG nova.compute.manager [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Received event network-vif-plugged-850e0bff-2f0b-42ec-807c-c397d8383288 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 912.185872] env[67977]: DEBUG oslo_concurrency.lockutils [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] Acquiring lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 912.186095] env[67977]: DEBUG oslo_concurrency.lockutils [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] Lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 912.186257] env[67977]: DEBUG oslo_concurrency.lockutils [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] Lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 912.186424] env[67977]: DEBUG nova.compute.manager [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] No waiting events found dispatching network-vif-plugged-850e0bff-2f0b-42ec-807c-c397d8383288 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 912.186590] env[67977]: WARNING nova.compute.manager [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Received unexpected event network-vif-plugged-850e0bff-2f0b-42ec-807c-c397d8383288 for instance with vm_state building and task_state spawning.
[ 912.186749] env[67977]: DEBUG nova.compute.manager [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Received event network-changed-850e0bff-2f0b-42ec-807c-c397d8383288 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 912.186900] env[67977]: DEBUG nova.compute.manager [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Refreshing instance network info cache due to event network-changed-850e0bff-2f0b-42ec-807c-c397d8383288. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 912.187079] env[67977]: DEBUG oslo_concurrency.lockutils [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] Acquiring lock "refresh_cache-48d09ae0-ab95-45e8-a916-ecf24abb66a0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 912.263454] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Updating instance_info_cache with network_info: [{"id": "850e0bff-2f0b-42ec-807c-c397d8383288", "address": "fa:16:3e:3f:ca:2a", "network": {"id": "4670e76b-e766-4136-a26c-e801979a2dfa", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-2063443269-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bc90d7e60864418ab61c128ae20c558", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "796c7fcb-00fd-4692-a44b-7ec550201e86", "external-id": "nsx-vlan-transportzone-42", "segmentation_id": 42, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap850e0bff-2f", "ovs_interfaceid": "850e0bff-2f0b-42ec-807c-c397d8383288", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 912.279844] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Releasing lock "refresh_cache-48d09ae0-ab95-45e8-a916-ecf24abb66a0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 912.280191] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Instance network_info: |[{"id": "850e0bff-2f0b-42ec-807c-c397d8383288", "address": "fa:16:3e:3f:ca:2a", "network": {"id": "4670e76b-e766-4136-a26c-e801979a2dfa", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-2063443269-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bc90d7e60864418ab61c128ae20c558", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "796c7fcb-00fd-4692-a44b-7ec550201e86", "external-id": "nsx-vlan-transportzone-42", "segmentation_id": 42, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap850e0bff-2f", "ovs_interfaceid": "850e0bff-2f0b-42ec-807c-c397d8383288", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 912.280531] env[67977]: DEBUG oslo_concurrency.lockutils [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] Acquired lock "refresh_cache-48d09ae0-ab95-45e8-a916-ecf24abb66a0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 912.280730] env[67977]: DEBUG nova.network.neutron [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Refreshing network info cache for port 850e0bff-2f0b-42ec-807c-c397d8383288 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 912.281867] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3f:ca:2a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '796c7fcb-00fd-4692-a44b-7ec550201e86', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '850e0bff-2f0b-42ec-807c-c397d8383288', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 912.290273] env[67977]: DEBUG oslo.service.loopingcall [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 912.291200] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 912.293773] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-301215de-91d2-4d50-948f-5a224270a796 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 912.315764] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 912.315764] env[67977]: value = "task-3468164"
[ 912.315764] env[67977]: _type = "Task"
[ 912.315764] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 912.323860] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468164, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 912.591592] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquiring lock "f6e698af-6d7e-40d5-988b-450f300b67a1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 912.795942] env[67977]: DEBUG nova.network.neutron [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Updated VIF entry in instance network info cache for port 850e0bff-2f0b-42ec-807c-c397d8383288.
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 912.795942] env[67977]: DEBUG nova.network.neutron [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Updating instance_info_cache with network_info: [{"id": "850e0bff-2f0b-42ec-807c-c397d8383288", "address": "fa:16:3e:3f:ca:2a", "network": {"id": "4670e76b-e766-4136-a26c-e801979a2dfa", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-2063443269-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bc90d7e60864418ab61c128ae20c558", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "796c7fcb-00fd-4692-a44b-7ec550201e86", "external-id": "nsx-vlan-transportzone-42", "segmentation_id": 42, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap850e0bff-2f", "ovs_interfaceid": "850e0bff-2f0b-42ec-807c-c397d8383288", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 912.806088] env[67977]: DEBUG oslo_concurrency.lockutils [req-4afeb5dc-bbfb-40d1-92d8-84db35f56141 req-095bafca-8950-482a-b155-5f3eede4a48f service nova] Releasing lock "refresh_cache-48d09ae0-ab95-45e8-a916-ecf24abb66a0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 912.826026] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468164, 'name': CreateVM_Task, 'duration_secs': 0.413689} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 912.828283] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 912.828283] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 912.828283] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 912.828283] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 912.828283] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fbd8c019-2fac-487a-87f6-a9d463b5d18d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 912.832396] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for the task: (returnval){ [ 912.832396] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ea975a-cc4e-0558-80fd-95560a25375f" [ 912.832396] env[67977]: _type = "Task" [ 912.832396] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 912.841305] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ea975a-cc4e-0558-80fd-95560a25375f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 913.342775] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 913.343091] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 913.343322] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 913.391835] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquiring lock "02dea9f7-00be-4305-909c-ab9245b60e1d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 914.127921] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquiring lock "faf24c4e-135e-47df-85a6-05024bc9b64b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 923.598783] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquiring lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 935.105977] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 939.661492] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquiring lock "b22ae1a7-c9b8-464b-a81c-73144a0176be" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 939.915230] env[67977]: DEBUG oslo_concurrency.lockutils [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquiring lock "83b04c8c-39f6-4f58-b965-0a94c063b68b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 942.500511] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 942.916015] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 944.270567] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquiring lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 944.270845] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 945.184384] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b7a53d53-481b-45af-a107-3a3ee04cbab0 tempest-ServersAaction247Test-1483982511 tempest-ServersAaction247Test-1483982511-project-member] Acquiring lock "2a175b1b-e44c-4fd0-801d-445ba66a993c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 945.184610] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b7a53d53-481b-45af-a107-3a3ee04cbab0 tempest-ServersAaction247Test-1483982511 tempest-ServersAaction247Test-1483982511-project-member] Lock "2a175b1b-e44c-4fd0-801d-445ba66a993c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 945.958096] env[67977]: DEBUG oslo_concurrency.lockutils [None req-25e4148e-24fd-409e-a4b4-c0dad5de3c41 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] Acquiring lock 
"4aa9df28-0b4e-4aef-a647-cd1bd3b15c66" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 945.958390] env[67977]: DEBUG oslo_concurrency.lockutils [None req-25e4148e-24fd-409e-a4b4-c0dad5de3c41 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] Lock "4aa9df28-0b4e-4aef-a647-cd1bd3b15c66" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 946.927387] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4fa4a161-da96-45e1-87a9-8ed3231d2db8 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] Acquiring lock "597679a6-42e4-4f77-af28-2eaca094f728" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 946.927886] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4fa4a161-da96-45e1-87a9-8ed3231d2db8 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] Lock "597679a6-42e4-4f77-af28-2eaca094f728" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 947.307160] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b43826e5-ee66-41b0-abe3-9993c21ea7f8 tempest-ServerShowV247Test-1404879766 tempest-ServerShowV247Test-1404879766-project-member] Acquiring lock "04d4d42b-1ff6-4159-8a34-1b3549d127c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 947.307542] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b43826e5-ee66-41b0-abe3-9993c21ea7f8 tempest-ServerShowV247Test-1404879766 tempest-ServerShowV247Test-1404879766-project-member] Lock "04d4d42b-1ff6-4159-8a34-1b3549d127c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 947.655627] env[67977]: DEBUG oslo_concurrency.lockutils [None req-2796c75f-eca8-45c4-bf85-a28a4edd1357 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] Acquiring lock "4cfb8096-93f1-4400-bb3a-5a2af940532e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 947.655891] env[67977]: DEBUG oslo_concurrency.lockutils [None req-2796c75f-eca8-45c4-bf85-a28a4edd1357 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] Lock "4cfb8096-93f1-4400-bb3a-5a2af940532e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 948.261708] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4245fcf7-c84a-4234-bd22-8a1cfde48e3f tempest-ServerShowV247Test-1404879766 tempest-ServerShowV247Test-1404879766-project-member] Acquiring lock "a294e292-bc5a-4e79-8224-1cb8c201e81d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 948.261988] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4245fcf7-c84a-4234-bd22-8a1cfde48e3f tempest-ServerShowV247Test-1404879766 tempest-ServerShowV247Test-1404879766-project-member] Lock "a294e292-bc5a-4e79-8224-1cb8c201e81d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 959.060681] env[67977]: WARNING oslo_vmware.rw_handles [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 959.060681] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 959.060681] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 959.060681] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 959.060681] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 959.060681] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 959.060681] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 959.060681] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 959.060681] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 959.060681] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 959.060681] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 959.060681] env[67977]: ERROR oslo_vmware.rw_handles [ 959.061294] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/3cace94f-d4a9-4bec-a1ac-7f9135767ecc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 959.063105] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 959.064042] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 
tempest-TenantUsagesTestJSON-842680904-project-member] Copying Virtual Disk [datastore1] vmware_temp/3cace94f-d4a9-4bec-a1ac-7f9135767ecc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/3cace94f-d4a9-4bec-a1ac-7f9135767ecc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 959.064042] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-41c79090-4c03-4ddb-ab4b-95d4cd2e6c71 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.072857] env[67977]: DEBUG oslo_vmware.api [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Waiting for the task: (returnval){ [ 959.072857] env[67977]: value = "task-3468165" [ 959.072857] env[67977]: _type = "Task" [ 959.072857] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 959.086332] env[67977]: DEBUG oslo_vmware.api [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Task: {'id': task-3468165, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 959.586584] env[67977]: DEBUG oslo_vmware.exceptions [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 959.586908] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 959.589664] env[67977]: ERROR nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 959.589664] env[67977]: Faults: ['InvalidArgument'] [ 959.589664] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Traceback (most recent call last): [ 959.589664] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 959.589664] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] yield resources [ 959.589664] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 959.589664] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] self.driver.spawn(context, instance, image_meta, [ 959.589664] env[67977]: ERROR nova.compute.manager [instance: 
f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 959.589664] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 959.589664] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 959.589664] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] self._fetch_image_if_missing(context, vi) [ 959.589664] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] image_cache(vi, tmp_image_ds_loc) [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] vm_util.copy_virtual_disk( [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] session._wait_for_task(vmdk_copy_task) [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] return self.wait_for_task(task_ref) [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] return evt.wait() [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] result = hub.switch() [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 959.590132] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] return self.greenlet.switch() [ 959.590971] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 959.590971] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] self.f(*self.args, **self.kw) [ 959.590971] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 959.590971] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] raise 
exceptions.translate_fault(task_info.error) [ 959.590971] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 959.590971] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Faults: ['InvalidArgument'] [ 959.590971] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] [ 959.590971] env[67977]: INFO nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Terminating instance [ 959.591620] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 959.591822] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 959.592679] env[67977]: DEBUG nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 959.593055] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 959.593246] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4bfa5ad4-fc39-4dcf-ac1c-ea41e6490ba1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.595861] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-137a0c3c-caf4-4a1a-aaca-d2067c92008c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.605561] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 959.605561] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0b810b03-b966-4b18-91a7-11ade81bcc56 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.609108] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 959.609307] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 959.611043] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e62d8da2-d078-4a5b-84a5-7ff9d3ab08eb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.616347] env[67977]: DEBUG oslo_vmware.api [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Waiting for the task: (returnval){ [ 959.616347] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52299bf5-8f44-3db8-cfc9-a383b828eb94" [ 959.616347] env[67977]: _type = "Task" [ 959.616347] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 959.625726] env[67977]: DEBUG oslo_vmware.api [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52299bf5-8f44-3db8-cfc9-a383b828eb94, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 959.681879] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 959.682043] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 959.682265] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Deleting the datastore file [datastore1] f6e698af-6d7e-40d5-988b-450f300b67a1 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 959.682584] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-182ab522-c954-427c-8cbb-02e6c27d1e35 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.690858] env[67977]: DEBUG oslo_vmware.api [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Waiting for the task: (returnval){ [ 959.690858] env[67977]: value = "task-3468167" [ 959.690858] env[67977]: _type = "Task" [ 959.690858] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 959.701831] env[67977]: DEBUG oslo_vmware.api [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Task: {'id': task-3468167, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 960.126647] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 960.126951] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Creating directory with path [datastore1] vmware_temp/fcfb2e7a-1e06-4788-9ae9-69790c54b02c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 960.130017] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ecd0f8b7-af0e-4e08-a3c6-78eeab0352b9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 960.145317] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Created directory with path [datastore1] vmware_temp/fcfb2e7a-1e06-4788-9ae9-69790c54b02c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 960.145542] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Fetch image to [datastore1] vmware_temp/fcfb2e7a-1e06-4788-9ae9-69790c54b02c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 960.146978] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/fcfb2e7a-1e06-4788-9ae9-69790c54b02c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 960.146978] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb3f23f9-d8f2-4df9-8fed-c7fec72f57a0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 960.153991] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72e1db66-84fa-47ba-b929-7690264e8b7f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 960.164381] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa65aad3-0b2e-40b8-8429-4a68070f461a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 960.208594] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b55efd89-c99c-4c44-b19b-ee379bc82872 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 960.217091] env[67977]: DEBUG oslo_vmware.api [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Task: {'id': task-3468167, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066298} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 960.220778] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 960.220778] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 960.220778] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 960.221093] env[67977]: INFO nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 960.225439] env[67977]: DEBUG nova.compute.claims [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 960.225439] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 960.225439] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 960.227133] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ea5c7f65-0ef7-42de-9a32-81492cf8525c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 960.252879] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 960.333844] env[67977]: DEBUG oslo_vmware.rw_handles [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fcfb2e7a-1e06-4788-9ae9-69790c54b02c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 960.399213] env[67977]: DEBUG oslo_vmware.rw_handles [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 960.399404] env[67977]: DEBUG oslo_vmware.rw_handles [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fcfb2e7a-1e06-4788-9ae9-69790c54b02c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 960.820311] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcc30010-fc48-478f-8956-bdf6b1b837d3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 960.834365] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbe82c47-1c4b-4424-84d2-68ff907918a2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 960.866135] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ef9470e-b2e3-45cd-8a7c-e6190d27d6f9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 960.874214] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ae1111f-0668-40a4-afde-d32c04c23e1e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 960.887735] env[67977]: DEBUG nova.compute.provider_tree [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 960.905094] env[67977]: DEBUG nova.scheduler.client.report [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 960.926910] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.703s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 960.927973] env[67977]: ERROR nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 960.927973] env[67977]: Faults: ['InvalidArgument'] [ 960.927973] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Traceback (most recent call last): [ 960.927973] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 960.927973] env[67977]: ERROR nova.compute.manager [instance: 
f6e698af-6d7e-40d5-988b-450f300b67a1] self.driver.spawn(context, instance, image_meta, [ 960.927973] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 960.927973] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 960.927973] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 960.927973] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] self._fetch_image_if_missing(context, vi) [ 960.927973] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 960.927973] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] image_cache(vi, tmp_image_ds_loc) [ 960.927973] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] vm_util.copy_virtual_disk( [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] session._wait_for_task(vmdk_copy_task) [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] return self.wait_for_task(task_ref) [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] return evt.wait() [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] result = hub.switch() [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] return self.greenlet.switch() [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 960.928339] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] self.f(*self.args, **self.kw) [ 960.928658] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 960.928658] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] raise exceptions.translate_fault(task_info.error) [ 960.928658] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 960.928658] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Faults: ['InvalidArgument'] [ 960.928658] env[67977]: ERROR nova.compute.manager [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] [ 960.928782] env[67977]: DEBUG nova.compute.utils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 960.930984] env[67977]: DEBUG nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Build of instance f6e698af-6d7e-40d5-988b-450f300b67a1 was re-scheduled: A specified parameter was not correct: fileType [ 960.930984] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 960.931473] env[67977]: DEBUG nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 960.931710] env[67977]: DEBUG nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 960.931917] env[67977]: DEBUG nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 960.932138] env[67977]: DEBUG nova.network.neutron [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 961.465376] env[67977]: DEBUG nova.network.neutron [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 961.489125] env[67977]: INFO nova.compute.manager [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Took 0.55 seconds to deallocate network for instance. [ 961.654335] env[67977]: INFO nova.scheduler.client.report [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Deleted allocations for instance f6e698af-6d7e-40d5-988b-450f300b67a1 [ 961.694106] env[67977]: DEBUG oslo_concurrency.lockutils [None req-15516c7f-15b7-45d5-b74d-d31b0184d1f1 tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Lock "f6e698af-6d7e-40d5-988b-450f300b67a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 249.092s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 961.696620] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Lock "f6e698af-6d7e-40d5-988b-450f300b67a1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 49.105s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 961.702998] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Acquiring lock "f6e698af-6d7e-40d5-988b-450f300b67a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 961.702998] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Lock "f6e698af-6d7e-40d5-988b-450f300b67a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67977) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 961.702998] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Lock "f6e698af-6d7e-40d5-988b-450f300b67a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 961.703882] env[67977]: INFO nova.compute.manager [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Terminating instance [ 961.707951] env[67977]: DEBUG nova.compute.manager [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 961.708604] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 961.708987] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2cf6b7ef-9576-465c-b472-d1e2c067ed65 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 961.715353] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: b017d568-1ad8-4d8d-84e8-5771341389bf] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 961.723263] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa6f4460-b71b-42f1-9983-eec9504d5b37 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 961.753942] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f6e698af-6d7e-40d5-988b-450f300b67a1 could not be found. [ 961.754180] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 961.754363] env[67977]: INFO nova.compute.manager [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Took 0.05 seconds to destroy the instance on the hypervisor.
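
The "Acquiring lock ... by ...", "acquired ... :: waited Ns" and ""released" ... :: held Ns" triplets that recur throughout this log come from oslo.concurrency: lockutils.synchronized() wraps the decorated callable in an inner() function (the lockutils.py:404/409/423 frames above) that logs the wait time on acquire and the hold time on release. A minimal sketch that drives the same instrumentation; the lock name and function are illustrative placeholders, not values from this log:

    # Sketch only: reproduce the lockutils DEBUG lines seen above.
    import logging

    from oslo_concurrency import lockutils

    logging.basicConfig(level=logging.DEBUG)

    @lockutils.synchronized('instance-uuid-placeholder')
    def do_terminate_instance():
        # Runs only while the named lock is held; the decorator's inner()
        # emits the acquire (":: waited Ns") and release (":: held Ns") lines.
        pass

    do_terminate_instance()
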
[ 961.754616] env[67977]: DEBUG oslo.service.loopingcall [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 961.757425] env[67977]: DEBUG nova.compute.manager [-] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 961.757425] env[67977]: DEBUG nova.network.neutron [-] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 961.758797] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: b017d568-1ad8-4d8d-84e8-5771341389bf] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 961.796882] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "b017d568-1ad8-4d8d-84e8-5771341389bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.965s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 961.819745] env[67977]: DEBUG nova.network.neutron [-] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 961.821411] env[67977]: DEBUG nova.compute.manager [None req-4c82ff3b-f1a7-4838-bede-750271ea1ecd tempest-FloatingIPsAssociationNegativeTestJSON-865865848 tempest-FloatingIPsAssociationNegativeTestJSON-865865848-project-member] [instance: ee6c409a-0d32-48fa-a873-b9b62040aef7] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 961.834644] env[67977]: INFO nova.compute.manager [-] [instance: f6e698af-6d7e-40d5-988b-450f300b67a1] Took 0.08 seconds to deallocate network for instance. [ 961.855868] env[67977]: DEBUG nova.compute.manager [None req-4c82ff3b-f1a7-4838-bede-750271ea1ecd tempest-FloatingIPsAssociationNegativeTestJSON-865865848 tempest-FloatingIPsAssociationNegativeTestJSON-865865848-project-member] [instance: ee6c409a-0d32-48fa-a873-b9b62040aef7] Instance disappeared before build.
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 961.887275] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c82ff3b-f1a7-4838-bede-750271ea1ecd tempest-FloatingIPsAssociationNegativeTestJSON-865865848 tempest-FloatingIPsAssociationNegativeTestJSON-865865848-project-member] Lock "ee6c409a-0d32-48fa-a873-b9b62040aef7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.823s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 961.901661] env[67977]: DEBUG nova.compute.manager [None req-c660dc3a-9f41-4637-9299-21ad45adf48e tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 7db91c79-1cdb-4101-a369-583b8bbae870] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 961.934208] env[67977]: DEBUG nova.compute.manager [None req-c660dc3a-9f41-4637-9299-21ad45adf48e tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 7db91c79-1cdb-4101-a369-583b8bbae870] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 961.964639] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c660dc3a-9f41-4637-9299-21ad45adf48e tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "7db91c79-1cdb-4101-a369-583b8bbae870" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.350s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 961.966298] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0ed7301a-09d0-437e-9ea6-0c068ab21ffe tempest-TenantUsagesTestJSON-842680904 tempest-TenantUsagesTestJSON-842680904-project-member] Lock "f6e698af-6d7e-40d5-988b-450f300b67a1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.270s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 961.979636] env[67977]: DEBUG nova.compute.manager [None req-7e45e376-a36f-4839-a332-410ca11ed964 tempest-ServerExternalEventsTest-819465719 tempest-ServerExternalEventsTest-819465719-project-member] [instance: e1027e0e-7938-4772-84c2-f879e9ce4144] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 962.010081] env[67977]: DEBUG nova.compute.manager [None req-7e45e376-a36f-4839-a332-410ca11ed964 tempest-ServerExternalEventsTest-819465719 tempest-ServerExternalEventsTest-819465719-project-member] [instance: e1027e0e-7938-4772-84c2-f879e9ce4144] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
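
The long run of "Instance disappeared before build." entries here is the flip side of the roughly 200-second "held" times above: each build request serialized behind the failed f6e698af build, and by the time its per-instance lock was granted, tempest had already deleted the server. The guard is the refresh-then-bail check logged at manager.py:2413; a rough paraphrase with illustrative names, not Nova's exact code:

    # Sketch of the "Instance disappeared before build." guard, assuming an
    # instance object whose refresh() raises when the DB record is gone.
    import logging

    LOG = logging.getLogger(__name__)

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def do_build_and_run_instance(instance):
        try:
            instance.refresh()  # re-read the record after the long lock wait
        except InstanceNotFound:
            LOG.debug('Instance disappeared before build.')
            return  # nothing to build; the caller releases the instance lock
        ...  # normal claim/network/spawn path continues here
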
[ 962.045318] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7e45e376-a36f-4839-a332-410ca11ed964 tempest-ServerExternalEventsTest-819465719 tempest-ServerExternalEventsTest-819465719-project-member] Lock "e1027e0e-7938-4772-84c2-f879e9ce4144" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 217.226s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 962.066034] env[67977]: DEBUG nova.compute.manager [None req-850f3328-669b-44f9-bf28-bee378ec3316 tempest-ServerAddressesNegativeTestJSON-1982447062 tempest-ServerAddressesNegativeTestJSON-1982447062-project-member] [instance: 3963518a-23de-434e-9f88-392a80daf120] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 962.106913] env[67977]: DEBUG nova.compute.manager [None req-850f3328-669b-44f9-bf28-bee378ec3316 tempest-ServerAddressesNegativeTestJSON-1982447062 tempest-ServerAddressesNegativeTestJSON-1982447062-project-member] [instance: 3963518a-23de-434e-9f88-392a80daf120] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 962.144946] env[67977]: DEBUG oslo_concurrency.lockutils [None req-850f3328-669b-44f9-bf28-bee378ec3316 tempest-ServerAddressesNegativeTestJSON-1982447062 tempest-ServerAddressesNegativeTestJSON-1982447062-project-member] Lock "3963518a-23de-434e-9f88-392a80daf120" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 216.523s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 962.158026] env[67977]: DEBUG nova.compute.manager [None req-4e5c801c-f63d-455e-abe6-da5996ab2bd1 tempest-VolumesAssistedSnapshotsTest-1609573494 tempest-VolumesAssistedSnapshotsTest-1609573494-project-member] [instance: b462ac2e-d668-4ac1-a6b3-2cfb49a2e0cd] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 962.210445] env[67977]: DEBUG nova.compute.manager [None req-4e5c801c-f63d-455e-abe6-da5996ab2bd1 tempest-VolumesAssistedSnapshotsTest-1609573494 tempest-VolumesAssistedSnapshotsTest-1609573494-project-member] [instance: b462ac2e-d668-4ac1-a6b3-2cfb49a2e0cd] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 962.263875] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4e5c801c-f63d-455e-abe6-da5996ab2bd1 tempest-VolumesAssistedSnapshotsTest-1609573494 tempest-VolumesAssistedSnapshotsTest-1609573494-project-member] Lock "b462ac2e-d668-4ac1-a6b3-2cfb49a2e0cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 215.330s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 962.280240] env[67977]: DEBUG nova.compute.manager [None req-6ce1fff8-9173-4d48-8e91-86a16d3be6e9 tempest-ServersWithSpecificFlavorTestJSON-2118342104 tempest-ServersWithSpecificFlavorTestJSON-2118342104-project-member] [instance: 8870c8cf-bf83-482d-91a9-47fdedc79586] Starting instance...
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 962.321168] env[67977]: DEBUG nova.compute.manager [None req-6ce1fff8-9173-4d48-8e91-86a16d3be6e9 tempest-ServersWithSpecificFlavorTestJSON-2118342104 tempest-ServersWithSpecificFlavorTestJSON-2118342104-project-member] [instance: 8870c8cf-bf83-482d-91a9-47fdedc79586] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 962.360660] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6ce1fff8-9173-4d48-8e91-86a16d3be6e9 tempest-ServersWithSpecificFlavorTestJSON-2118342104 tempest-ServersWithSpecificFlavorTestJSON-2118342104-project-member] Lock "8870c8cf-bf83-482d-91a9-47fdedc79586" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.963s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 962.373611] env[67977]: DEBUG nova.compute.manager [None req-939b9d0b-252f-4f4e-b159-2fb267c491f9 tempest-ServerDiagnosticsTest-1479388244 tempest-ServerDiagnosticsTest-1479388244-project-member] [instance: bb8b7561-424e-48ba-9faa-65d6f6465a20] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 962.401378] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquiring lock "32d860b3-f438-400f-8296-e62cc662d618" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 962.405191] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "32d860b3-f438-400f-8296-e62cc662d618" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 962.413582] env[67977]: DEBUG nova.compute.manager [None req-939b9d0b-252f-4f4e-b159-2fb267c491f9 tempest-ServerDiagnosticsTest-1479388244 tempest-ServerDiagnosticsTest-1479388244-project-member] [instance: bb8b7561-424e-48ba-9faa-65d6f6465a20] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 962.448678] env[67977]: DEBUG oslo_concurrency.lockutils [None req-939b9d0b-252f-4f4e-b159-2fb267c491f9 tempest-ServerDiagnosticsTest-1479388244 tempest-ServerDiagnosticsTest-1479388244-project-member] Lock "bb8b7561-424e-48ba-9faa-65d6f6465a20" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 204.690s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 962.461579] env[67977]: DEBUG nova.compute.manager [None req-8273f9d9-aa1c-4d37-8c7f-7b17bde62d7c tempest-AttachInterfacesV270Test-1989404753 tempest-AttachInterfacesV270Test-1989404753-project-member] [instance: d6893024-9531-435b-8893-38f310224d7b] Starting instance...
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 962.487082] env[67977]: DEBUG nova.compute.manager [None req-8273f9d9-aa1c-4d37-8c7f-7b17bde62d7c tempest-AttachInterfacesV270Test-1989404753 tempest-AttachInterfacesV270Test-1989404753-project-member] [instance: d6893024-9531-435b-8893-38f310224d7b] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 962.509032] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8273f9d9-aa1c-4d37-8c7f-7b17bde62d7c tempest-AttachInterfacesV270Test-1989404753 tempest-AttachInterfacesV270Test-1989404753-project-member] Lock "d6893024-9531-435b-8893-38f310224d7b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 203.519s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 962.522688] env[67977]: DEBUG nova.compute.manager [None req-abeac5ed-1bd2-4ec7-91ee-5fab80fd36c9 tempest-ServersTestJSON-1266629660 tempest-ServersTestJSON-1266629660-project-member] [instance: 0d6c2ea5-71ff-49bb-ae23-3323e9b4a3f6] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 962.552665] env[67977]: DEBUG nova.compute.manager [None req-abeac5ed-1bd2-4ec7-91ee-5fab80fd36c9 tempest-ServersTestJSON-1266629660 tempest-ServersTestJSON-1266629660-project-member] [instance: 0d6c2ea5-71ff-49bb-ae23-3323e9b4a3f6] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 962.575984] env[67977]: DEBUG oslo_concurrency.lockutils [None req-abeac5ed-1bd2-4ec7-91ee-5fab80fd36c9 tempest-ServersTestJSON-1266629660 tempest-ServersTestJSON-1266629660-project-member] Lock "0d6c2ea5-71ff-49bb-ae23-3323e9b4a3f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 202.657s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 962.586239] env[67977]: DEBUG nova.compute.manager [None req-827b8bbb-c3bf-43cb-a135-ab3275182ba2 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 72a209af-5976-4943-9752-8c258bb24158] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 962.613721] env[67977]: DEBUG nova.compute.manager [None req-827b8bbb-c3bf-43cb-a135-ab3275182ba2 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 72a209af-5976-4943-9752-8c258bb24158] Instance disappeared before build.
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 962.639226] env[67977]: DEBUG oslo_concurrency.lockutils [None req-827b8bbb-c3bf-43cb-a135-ab3275182ba2 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "72a209af-5976-4943-9752-8c258bb24158" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 201.638s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 962.650274] env[67977]: DEBUG nova.compute.manager [None req-b59f81c3-59c6-487a-a1d4-f0c4b6f3c5f0 tempest-ServersAdminNegativeTestJSON-393042103 tempest-ServersAdminNegativeTestJSON-393042103-project-member] [instance: 21a172f7-20d4-4f17-af4d-cadc0fa33c1d] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 962.678432] env[67977]: DEBUG nova.compute.manager [None req-b59f81c3-59c6-487a-a1d4-f0c4b6f3c5f0 tempest-ServersAdminNegativeTestJSON-393042103 tempest-ServersAdminNegativeTestJSON-393042103-project-member] [instance: 21a172f7-20d4-4f17-af4d-cadc0fa33c1d] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 962.707784] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b59f81c3-59c6-487a-a1d4-f0c4b6f3c5f0 tempest-ServersAdminNegativeTestJSON-393042103 tempest-ServersAdminNegativeTestJSON-393042103-project-member] Lock "21a172f7-20d4-4f17-af4d-cadc0fa33c1d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 201.436s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 962.719485] env[67977]: DEBUG nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Starting instance...
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 962.775882] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 962.776318] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Cleaning up deleted instances {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 962.799926] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] There are 1 instances to clean {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 962.800355] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1b5b8be5-7e9c-4269-994a-e54aeb75774f] Instance has had 0 of 5 cleanup attempts {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11211}} [ 962.809812] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 962.811066] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 962.813969] env[67977]: INFO nova.compute.claims [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 962.875100] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 962.875303] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Cleaning up deleted instances with incomplete migration {{(pid=67977) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 962.884497] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 963.319507] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ac87453-1dd7-4b79-9db1-d056e676884d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.328783] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-afad66bb-1512-4d0f-b524-3af39902070a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.362393] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7777b7be-e5f4-4d35-87f7-c2a73be9a0cc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.370456] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-774fdd6c-7510-43d9-99bf-28da75bc0f5b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.384676] env[67977]: DEBUG nova.compute.provider_tree [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 963.395093] env[67977]: DEBUG nova.scheduler.client.report [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 963.413948] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.603s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 963.414970] env[67977]: DEBUG nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 963.467366] env[67977]: DEBUG nova.compute.utils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 963.469724] env[67977]: DEBUG nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Allocating IP information in the background. 
{{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 963.469938] env[67977]: DEBUG nova.network.neutron [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 963.481898] env[67977]: DEBUG nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 963.572573] env[67977]: DEBUG nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 963.581274] env[67977]: DEBUG nova.policy [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '78df84566c65469890b3b6f15f3e5e01', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ff581ae563e45108f497cade6990d79', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 963.606849] env[67977]: DEBUG nova.virt.hardware [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=<?>,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:53:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 963.607145] env[67977]: DEBUG nova.virt.hardware [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 963.607268] env[67977]: DEBUG nova.virt.hardware [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Image limits 0:0:0 {{(pid=67977)
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 963.607450] env[67977]: DEBUG nova.virt.hardware [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 963.607770] env[67977]: DEBUG nova.virt.hardware [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 963.607770] env[67977]: DEBUG nova.virt.hardware [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 963.608022] env[67977]: DEBUG nova.virt.hardware [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 963.608223] env[67977]: DEBUG nova.virt.hardware [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 963.608427] env[67977]: DEBUG nova.virt.hardware [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 963.608859] env[67977]: DEBUG nova.virt.hardware [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 963.608859] env[67977]: DEBUG nova.virt.hardware [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 963.609788] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d50dd0d7-2d8c-4e48-b78e-0e2c4e6773cc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.618236] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2081304b-5e30-4a70-8b40-a96254085595 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.570378] env[67977]: DEBUG nova.network.neutron [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 
d7719b11-cef7-4878-a693-24dcd085a1d7] Successfully created port: 44e07e4c-c632-43af-be6c-38fd7f61490a {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 964.687129] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "d7719b11-cef7-4878-a693-24dcd085a1d7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 965.432268] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f0afa998-2ebe-495b-970d-dd324c8fb750 tempest-InstanceActionsNegativeTestJSON-546422566 tempest-InstanceActionsNegativeTestJSON-546422566-project-member] Acquiring lock "cfc3a5c1-ec4a-41c6-911f-96bc0586d17e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 965.432523] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f0afa998-2ebe-495b-970d-dd324c8fb750 tempest-InstanceActionsNegativeTestJSON-546422566 tempest-InstanceActionsNegativeTestJSON-546422566-project-member] Lock "cfc3a5c1-ec4a-41c6-911f-96bc0586d17e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 966.625800] env[67977]: DEBUG nova.network.neutron [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Successfully updated port: 44e07e4c-c632-43af-be6c-38fd7f61490a {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 966.648032] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "refresh_cache-d7719b11-cef7-4878-a693-24dcd085a1d7" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 966.649670] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "refresh_cache-d7719b11-cef7-4878-a693-24dcd085a1d7" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 966.651776] env[67977]: DEBUG nova.network.neutron [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 966.753780] env[67977]: DEBUG nova.network.neutron [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Instance cache missing network info.
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 966.887068] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 966.887339] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 966.887501] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 966.887682] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 967.425292] env[67977]: DEBUG nova.network.neutron [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Updating instance_info_cache with network_info: [{"id": "44e07e4c-c632-43af-be6c-38fd7f61490a", "address": "fa:16:3e:7b:35:53", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44e07e4c-c6", "ovs_interfaceid": "44e07e4c-c632-43af-be6c-38fd7f61490a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 967.441606] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "refresh_cache-d7719b11-cef7-4878-a693-24dcd085a1d7" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 967.441774] env[67977]: DEBUG nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Instance network_info: |[{"id": "44e07e4c-c632-43af-be6c-38fd7f61490a", "address": 
"fa:16:3e:7b:35:53", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44e07e4c-c6", "ovs_interfaceid": "44e07e4c-c632-43af-be6c-38fd7f61490a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 967.442276] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7b:35:53', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5efce30e-48dd-493a-a354-f562a8adf7af', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '44e07e4c-c632-43af-be6c-38fd7f61490a', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 967.450243] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating folder: Project (4ff581ae563e45108f497cade6990d79). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 967.450844] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7abb3b45-1dae-49d4-ac26-5de81f2bae80 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.463089] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Created folder: Project (4ff581ae563e45108f497cade6990d79) in parent group-v693022. [ 967.463293] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating folder: Instances. Parent ref: group-v693071. 
{{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 967.463541] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-64431cad-9517-42a4-ad2c-572132e95d10 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.474204] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Created folder: Instances in parent group-v693071. [ 967.474460] env[67977]: DEBUG oslo.service.loopingcall [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 967.474653] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 967.474970] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cc4eaa5f-5505-443e-9dae-2e7e45b86539 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.496082] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 967.496082] env[67977]: value = "task-3468170" [ 967.496082] env[67977]: _type = "Task" [ 967.496082] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 967.504210] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468170, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
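
The CreateVM_Task exchange above is the standard oslo.vmware pattern: the SOAP invocation returns a Task managed object, and VMwareAPISession.wait_for_task() polls its TaskInfo on a looping call (the wait_for_task/api.py:397 and _poll_task/api.py:434 frames) until it reports success, as it does at [ 968.012092] below with duration_secs. A minimal sketch of the same round trip; the host, credentials and managed-object references are placeholders, not values from this log:

    # Sketch only: submit a vSphere task and block on it with oslo.vmware.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc.example.org', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # Placeholders: in Nova these come from earlier PropertyCollector and
    # Folder calls (folder moref, VirtualMachineConfigSpec, resource pool).
    folder_ref = config_spec = res_pool_ref = None

    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=res_pool_ref)
    task_info = session.wait_for_task(task)  # polls until SUCCESS or error
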
[ 967.749259] env[67977]: DEBUG nova.compute.manager [req-aa42354e-5f5e-4744-9695-f1cd1de13803 req-4c379ba3-a16f-4545-a8c5-cbeeab5e4b23 service nova] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Received event network-vif-plugged-44e07e4c-c632-43af-be6c-38fd7f61490a {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 967.749259] env[67977]: DEBUG oslo_concurrency.lockutils [req-aa42354e-5f5e-4744-9695-f1cd1de13803 req-4c379ba3-a16f-4545-a8c5-cbeeab5e4b23 service nova] Acquiring lock "d7719b11-cef7-4878-a693-24dcd085a1d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 967.751234] env[67977]: DEBUG oslo_concurrency.lockutils [req-aa42354e-5f5e-4744-9695-f1cd1de13803 req-4c379ba3-a16f-4545-a8c5-cbeeab5e4b23 service nova] Lock "d7719b11-cef7-4878-a693-24dcd085a1d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 967.751234] env[67977]: DEBUG oslo_concurrency.lockutils [req-aa42354e-5f5e-4744-9695-f1cd1de13803 req-4c379ba3-a16f-4545-a8c5-cbeeab5e4b23 service nova] Lock "d7719b11-cef7-4878-a693-24dcd085a1d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 967.751234] env[67977]: DEBUG nova.compute.manager [req-aa42354e-5f5e-4744-9695-f1cd1de13803 req-4c379ba3-a16f-4545-a8c5-cbeeab5e4b23 service nova] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] No waiting events found dispatching network-vif-plugged-44e07e4c-c632-43af-be6c-38fd7f61490a {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 967.751234] env[67977]: WARNING nova.compute.manager [req-aa42354e-5f5e-4744-9695-f1cd1de13803 req-4c379ba3-a16f-4545-a8c5-cbeeab5e4b23 service nova] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Received unexpected event network-vif-plugged-44e07e4c-c632-43af-be6c-38fd7f61490a for instance with vm_state building and task_state deleting. [ 967.775419] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 967.776896] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 967.776896] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 967.776896] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 967.792054] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 967.792294] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 967.792498] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 967.792663] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 967.794332] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c0d9713-1b41-430f-9863-461dc54bcf8a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.803333] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa1b578f-3638-42fd-97b2-760cc0b89283 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.827362] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d060c535-3bf3-4846-b3fd-a1be049c4b94 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.834390] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f46fa6d-41cf-4ccf-9c0a-1e6e5b60b584 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.870025] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180949MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 967.870025] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 967.870025] 
env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 968.012092] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468170, 'name': CreateVM_Task, 'duration_secs': 0.304977} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 968.012092] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 968.017114] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 968.017114] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 968.017114] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 968.017114] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f65ddc73-a30a-4386-8e3f-d92f07cbbf1a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.020375] env[67977]: DEBUG oslo_vmware.api [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 968.020375] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52c1022b-3442-393a-8c76-c4b43029a3d6" [ 968.020375] env[67977]: _type = "Task" [ 968.020375] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 968.040266] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 968.040584] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 968.040821] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 968.110673] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance faf24c4e-135e-47df-85a6-05024bc9b64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 968.110826] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 02dea9f7-00be-4305-909c-ab9245b60e1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 968.110951] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a2fd776e-9a01-4b67-bc23-1605d6e2b23e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 968.111089] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 968.111207] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 968.111321] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b22ae1a7-c9b8-464b-a81c-73144a0176be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 968.111431] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 83b04c8c-39f6-4f58-b965-0a94c063b68b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 968.111539] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 968.111646] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 968.111753] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d7719b11-cef7-4878-a693-24dcd085a1d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 968.126204] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e7543070-519f-470d-b3dd-964b60ce149f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.144156] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance de7e2949-00a0-4ce7-9a54-c678d8722464 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.164143] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 4a59ec41-924b-4eb0-a025-4820479d535b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.185177] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2a2e7c1d-af91-48c8-bbbf-3265d7407bb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.200451] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e0c3bec9-6a83-4104-87db-673f90fb1247 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.223177] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 05ea43b1-42c7-464b-89c9-b405f7ba20da has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.242153] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 20642d86-67cd-41ee-ac01-d59fcb5d6243 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.256767] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.276314] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.294472] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b8faf6c6-2531-44b9-8382-ddbc0feddf24 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.310307] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.323030] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2a175b1b-e44c-4fd0-801d-445ba66a993c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.341881] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 4aa9df28-0b4e-4aef-a647-cd1bd3b15c66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.353848] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 597679a6-42e4-4f77-af28-2eaca094f728 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.370925] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 04d4d42b-1ff6-4159-8a34-1b3549d127c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.388211] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 4cfb8096-93f1-4400-bb3a-5a2af940532e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.402013] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a294e292-bc5a-4e79-8224-1cb8c201e81d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.431207] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 32d860b3-f438-400f-8296-e62cc662d618 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.444591] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance cfc3a5c1-ec4a-41c6-911f-96bc0586d17e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 968.444839] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 968.444984] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 968.472473] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing inventories for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 968.492849] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating ProviderTree inventory for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 968.493209] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating inventory in ProviderTree for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 968.515794] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing aggregate 
associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, aggregates: None {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 968.550862] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing trait associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, traits: COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 969.117859] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab1ef08a-7b78-4cb1-a40b-a4d4c4a718db {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 969.126153] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d86ef264-2232-40e6-a075-e6d325766ac9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 969.164452] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ca6320a-ca41-4690-8f4d-473588b4bf44 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 969.172679] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e0a3e0b-0ef8-456f-911c-0ae8534af24d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 969.188189] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 969.198472] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 969.221556] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 969.221772] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.352s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 970.224642] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 970.426688] env[67977]: DEBUG oslo_concurrency.lockutils [None req-567aae11-b45a-4166-b527-206e03bac6d0 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "2977d1a5-655c-4dda-bd9e-3664770f3b62" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 970.428026] env[67977]: DEBUG oslo_concurrency.lockutils [None req-567aae11-b45a-4166-b527-206e03bac6d0 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "2977d1a5-655c-4dda-bd9e-3664770f3b62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 970.775365] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 970.775558] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 970.775688] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 970.804670] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 970.804900] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 970.805097] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 970.805267] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 970.805438] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 970.805595] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 970.805762] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 970.805914] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 970.806094] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 970.806276] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 970.806446] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 970.927502] env[67977]: DEBUG nova.compute.manager [req-652e5d86-84cb-46e3-9fa2-7710b36448b9 req-5341e493-b30f-4706-85f0-1de1161e5ae8 service nova] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Received event network-changed-44e07e4c-c632-43af-be6c-38fd7f61490a {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 970.927774] env[67977]: DEBUG nova.compute.manager [req-652e5d86-84cb-46e3-9fa2-7710b36448b9 req-5341e493-b30f-4706-85f0-1de1161e5ae8 service nova] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Refreshing instance network info cache due to event network-changed-44e07e4c-c632-43af-be6c-38fd7f61490a. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 970.928030] env[67977]: DEBUG oslo_concurrency.lockutils [req-652e5d86-84cb-46e3-9fa2-7710b36448b9 req-5341e493-b30f-4706-85f0-1de1161e5ae8 service nova] Acquiring lock "refresh_cache-d7719b11-cef7-4878-a693-24dcd085a1d7" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 970.928375] env[67977]: DEBUG oslo_concurrency.lockutils [req-652e5d86-84cb-46e3-9fa2-7710b36448b9 req-5341e493-b30f-4706-85f0-1de1161e5ae8 service nova] Acquired lock "refresh_cache-d7719b11-cef7-4878-a693-24dcd085a1d7" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 970.928375] env[67977]: DEBUG nova.network.neutron [req-652e5d86-84cb-46e3-9fa2-7710b36448b9 req-5341e493-b30f-4706-85f0-1de1161e5ae8 service nova] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Refreshing network info cache for port 44e07e4c-c632-43af-be6c-38fd7f61490a {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 971.130499] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f4055fa8-ff89-4d6e-88cf-55ba3a1982ed tempest-ServerAddressesTestJSON-1824039280 tempest-ServerAddressesTestJSON-1824039280-project-member] Acquiring lock "d0df4be4-5fb6-4f3a-b2fb-695917fe2b7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 971.130745] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f4055fa8-ff89-4d6e-88cf-55ba3a1982ed tempest-ServerAddressesTestJSON-1824039280 tempest-ServerAddressesTestJSON-1824039280-project-member] Lock "d0df4be4-5fb6-4f3a-b2fb-695917fe2b7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 971.307048] env[67977]: DEBUG nova.network.neutron [req-652e5d86-84cb-46e3-9fa2-7710b36448b9 req-5341e493-b30f-4706-85f0-1de1161e5ae8 service nova] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Updated VIF entry in instance network info cache for port 44e07e4c-c632-43af-be6c-38fd7f61490a. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 971.307423] env[67977]: DEBUG nova.network.neutron [req-652e5d86-84cb-46e3-9fa2-7710b36448b9 req-5341e493-b30f-4706-85f0-1de1161e5ae8 service nova] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Updating instance_info_cache with network_info: [{"id": "44e07e4c-c632-43af-be6c-38fd7f61490a", "address": "fa:16:3e:7b:35:53", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44e07e4c-c6", "ovs_interfaceid": "44e07e4c-c632-43af-be6c-38fd7f61490a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 971.318526] env[67977]: DEBUG oslo_concurrency.lockutils [req-652e5d86-84cb-46e3-9fa2-7710b36448b9 req-5341e493-b30f-4706-85f0-1de1161e5ae8 service nova] Releasing lock "refresh_cache-d7719b11-cef7-4878-a693-24dcd085a1d7" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 972.175841] env[67977]: DEBUG oslo_concurrency.lockutils [None req-73346213-1e72-4300-a48a-34d4083e3167 tempest-ServersTestMultiNic-1583074918 tempest-ServersTestMultiNic-1583074918-project-member] Acquiring lock "50ffd9c5-6232-4c6a-ae8c-5492cdf07e32" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 972.176311] env[67977]: DEBUG oslo_concurrency.lockutils [None req-73346213-1e72-4300-a48a-34d4083e3167 tempest-ServersTestMultiNic-1583074918 tempest-ServersTestMultiNic-1583074918-project-member] Lock "50ffd9c5-6232-4c6a-ae8c-5492cdf07e32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 978.263335] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1a8319ac-043e-4381-acdc-7e571a1f63d4 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Acquiring lock "fd7c6688-4e12-4186-9235-b2ea93592dae" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 978.263627] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1a8319ac-043e-4381-acdc-7e571a1f63d4 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Lock 
"fd7c6688-4e12-4186-9235-b2ea93592dae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 978.297475] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1a8319ac-043e-4381-acdc-7e571a1f63d4 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Acquiring lock "16c81008-fc75-4722-99c2-bfcdb3121d72" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 978.297738] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1a8319ac-043e-4381-acdc-7e571a1f63d4 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Lock "16c81008-fc75-4722-99c2-bfcdb3121d72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 991.211274] env[67977]: DEBUG oslo_concurrency.lockutils [None req-068e211b-dc93-4099-90ec-09b6259fe94c tempest-ServersNegativeTestMultiTenantJSON-1350597422 tempest-ServersNegativeTestMultiTenantJSON-1350597422-project-member] Acquiring lock "e8092e46-3f2e-4b1a-ad47-e8a5c16db13c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 991.211623] env[67977]: DEBUG oslo_concurrency.lockutils [None req-068e211b-dc93-4099-90ec-09b6259fe94c tempest-ServersNegativeTestMultiTenantJSON-1350597422 tempest-ServersNegativeTestMultiTenantJSON-1350597422-project-member] Lock "e8092e46-3f2e-4b1a-ad47-e8a5c16db13c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 992.138330] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c92e587b-96db-460b-9283-34bd62954090 tempest-AttachInterfacesUnderV243Test-1161791688 tempest-AttachInterfacesUnderV243Test-1161791688-project-member] Acquiring lock "3a02e857-fdf9-47fb-a464-7e3683c1ac93" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 992.138549] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c92e587b-96db-460b-9283-34bd62954090 tempest-AttachInterfacesUnderV243Test-1161791688 tempest-AttachInterfacesUnderV243Test-1161791688-project-member] Lock "3a02e857-fdf9-47fb-a464-7e3683c1ac93" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 999.480586] env[67977]: DEBUG oslo_concurrency.lockutils [None req-16e1c2c4-e4ac-4682-9230-9aef90ea05af tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "c2cc96a9-e755-4c6c-b34a-eb28c9c38066" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 999.481487] env[67977]: DEBUG oslo_concurrency.lockutils [None req-16e1c2c4-e4ac-4682-9230-9aef90ea05af tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "c2cc96a9-e755-4c6c-b34a-eb28c9c38066" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1001.594100] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3cb17c46-33f5-4b9d-b713-6c4fff757516 tempest-ServerRescueNegativeTestJSON-1640851732 tempest-ServerRescueNegativeTestJSON-1640851732-project-member] Acquiring lock "d69d2ea1-575c-4616-a3b4-5b8f381d1fa7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.594412] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3cb17c46-33f5-4b9d-b713-6c4fff757516 tempest-ServerRescueNegativeTestJSON-1640851732 tempest-ServerRescueNegativeTestJSON-1640851732-project-member] Lock "d69d2ea1-575c-4616-a3b4-5b8f381d1fa7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1002.943531] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b7769a77-5d12-45f6-8b6d-233831448ad2 tempest-ServerRescueNegativeTestJSON-1640851732 tempest-ServerRescueNegativeTestJSON-1640851732-project-member] Acquiring lock "1eb71176-689e-4a9d-8852-b289c3d1abbd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1002.943531] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b7769a77-5d12-45f6-8b6d-233831448ad2 tempest-ServerRescueNegativeTestJSON-1640851732 tempest-ServerRescueNegativeTestJSON-1640851732-project-member] Lock "1eb71176-689e-4a9d-8852-b289c3d1abbd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1007.175318] env[67977]: WARNING oslo_vmware.rw_handles [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1007.175318] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1007.175318] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1007.175318] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1007.175318] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1007.175318] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1007.175318] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1007.175318] env[67977]: ERROR oslo_vmware.rw_handles version, 
status, reason = self._read_status() [ 1007.175318] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1007.175318] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1007.175318] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1007.175318] env[67977]: ERROR oslo_vmware.rw_handles [ 1007.175946] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/fcfb2e7a-1e06-4788-9ae9-69790c54b02c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1007.177613] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1007.177886] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Copying Virtual Disk [datastore1] vmware_temp/fcfb2e7a-1e06-4788-9ae9-69790c54b02c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/fcfb2e7a-1e06-4788-9ae9-69790c54b02c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1007.178242] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a605a392-78d0-4ef2-a7f1-92170bbb7134 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.186376] env[67977]: DEBUG oslo_vmware.api [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Waiting for the task: (returnval){ [ 1007.186376] env[67977]: value = "task-3468171" [ 1007.186376] env[67977]: _type = "Task" [ 1007.186376] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1007.194714] env[67977]: DEBUG oslo_vmware.api [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Task: {'id': task-3468171, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1007.698196] env[67977]: DEBUG oslo_vmware.exceptions [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1007.698487] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1007.699187] env[67977]: ERROR nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1007.699187] env[67977]: Faults: ['InvalidArgument'] [ 1007.699187] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Traceback (most recent call last): [ 1007.699187] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1007.699187] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] yield resources [ 1007.699187] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1007.699187] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] self.driver.spawn(context, instance, image_meta, [ 1007.699187] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1007.699187] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1007.699187] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1007.699187] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] self._fetch_image_if_missing(context, vi) [ 1007.699187] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] image_cache(vi, tmp_image_ds_loc) [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] vm_util.copy_virtual_disk( [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] session._wait_for_task(vmdk_copy_task) [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] return self.wait_for_task(task_ref) [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] return evt.wait() [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] result = hub.switch() [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1007.699591] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] return self.greenlet.switch() [ 1007.699965] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1007.699965] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] self.f(*self.args, **self.kw) [ 1007.699965] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1007.699965] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] raise exceptions.translate_fault(task_info.error) [ 1007.699965] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1007.699965] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Faults: ['InvalidArgument'] [ 1007.699965] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] [ 1007.699965] env[67977]: INFO nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Terminating instance [ 1007.700987] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1007.701212] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1007.701833] env[67977]: DEBUG nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 
tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1007.702031] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1007.702255] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e8c56ece-92e6-4366-95a4-8ccf11dddee7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.704579] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c410d9d-1c45-4e7b-a4fb-ebff9f1d8700 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.711821] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1007.712046] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-209b2a9d-dddd-4e22-b62b-77fc022656f9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.714264] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1007.714438] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1007.715403] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-98b5e006-6bcf-46b4-afbb-0d0f88988891 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.720154] env[67977]: DEBUG oslo_vmware.api [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Waiting for the task: (returnval){ [ 1007.720154] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ec9f84-ca40-df9d-4913-622f739b82a5" [ 1007.720154] env[67977]: _type = "Task" [ 1007.720154] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1007.727833] env[67977]: DEBUG oslo_vmware.api [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ec9f84-ca40-df9d-4913-622f739b82a5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1007.777920] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1007.778182] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1007.778453] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Deleting the datastore file [datastore1] faf24c4e-135e-47df-85a6-05024bc9b64b {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1007.778656] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4c64b8ca-ffb4-4884-b30c-aeb0427695cd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.786318] env[67977]: DEBUG oslo_vmware.api [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Waiting for the task: (returnval){ [ 1007.786318] env[67977]: value = "task-3468173" [ 1007.786318] env[67977]: _type = "Task" [ 1007.786318] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1007.799268] env[67977]: DEBUG oslo_vmware.api [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Task: {'id': task-3468173, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1008.231213] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1008.231604] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Creating directory with path [datastore1] vmware_temp/b8c4ffef-b6c2-478e-9f47-66931ce17a41/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1008.231713] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f2dc8895-1cc0-4a5d-936a-b209aeb1d0a7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.244262] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Created directory with path [datastore1] vmware_temp/b8c4ffef-b6c2-478e-9f47-66931ce17a41/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1008.244468] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Fetch image to [datastore1] vmware_temp/b8c4ffef-b6c2-478e-9f47-66931ce17a41/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1008.244640] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/b8c4ffef-b6c2-478e-9f47-66931ce17a41/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1008.245625] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f392ab6-2e00-4619-bfe1-3667c7bbcd20 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.252404] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8893b97c-b387-46a8-8473-95020b251875 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.261376] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7eea2aa-407d-45fe-883e-0b8d3c76ec9a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.298881] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d421f628-b547-40b9-94e2-17f56c60435b 
{{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.306560] env[67977]: DEBUG oslo_vmware.api [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Task: {'id': task-3468173, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070504} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1008.308341] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1008.308710] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1008.308926] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1008.309128] env[67977]: INFO nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Took 0.61 seconds to destroy the instance on the hypervisor. 
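The spawn failure recorded above follows oslo.vmware's task-polling contract: wait_for_task() (api.py:397) parks the caller while _poll_task (api.py:434/444/448) fetches fresh TaskInfo from vCenter; a terminal 'error' state is converted into a Python exception via exceptions.translate_fault(), which is how the server-side InvalidArgument fault on CopyVirtualDisk_Task surfaces as the VimFaultException ("A specified parameter was not correct: fileType") that aborts this build. A minimal sketch of that terminal-state handling, assuming nothing beyond what the traceback itself shows -- check_task_info is an illustrative name, not an oslo.vmware API:

    from oslo_vmware import exceptions as vexc

    def check_task_info(task_info):
        # Illustrative only: mirrors _poll_task's terminal-state handling as
        # seen in the traceback above (oslo_vmware/api.py:448).
        if task_info.state == 'success':
            return task_info          # wait_for_task() hands this back
        if task_info.state == 'error':
            # Map the VIM fault name (here 'InvalidArgument') to a registered
            # exception class; unmatched faults ("Fault InvalidArgument not
            # matched", exceptions.py:290) fall back to VimFaultException.
            raise vexc.translate_fault(task_info.error)
        return None                   # 'queued'/'running': keep polling

On this error path, as the surrounding records show, nova.virt.vmwareapi unregisters the VM and deletes its datastore directory, and the compute manager then aborts the placement claim.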
[ 1008.311189] env[67977]: DEBUG nova.compute.claims [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1008.311437] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1008.311726] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1008.314192] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-07f6f57a-a4ef-463a-9552-41f5ec146bcf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.337215] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1008.402171] env[67977]: DEBUG oslo_vmware.rw_handles [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b8c4ffef-b6c2-478e-9f47-66931ce17a41/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1008.472248] env[67977]: DEBUG oslo_vmware.rw_handles [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1008.473032] env[67977]: DEBUG oslo_vmware.rw_handles [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b8c4ffef-b6c2-478e-9f47-66931ce17a41/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1008.939147] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cb7ecd8-786d-44a7-9dcc-6984e0f5ad26 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.947689] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48b37320-2c4b-44bf-ad96-da140b45bbe9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.978244] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa7f3e0a-1116-4708-a78e-50e22648cad6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.986470] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11bc686a-8d5f-456a-abd5-352ee0eadf0a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.000428] env[67977]: DEBUG nova.compute.provider_tree [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1009.010674] env[67977]: DEBUG nova.scheduler.client.report [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1009.036139] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.724s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1009.036680] env[67977]: ERROR nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1009.036680] env[67977]: Faults: ['InvalidArgument'] [ 1009.036680] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Traceback (most recent call last): [ 1009.036680] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1009.036680] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] self.driver.spawn(context, instance, image_meta, [ 1009.036680] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1009.036680] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1009.036680] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1009.036680] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] self._fetch_image_if_missing(context, vi) [ 1009.036680] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1009.036680] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] image_cache(vi, tmp_image_ds_loc) [ 1009.036680] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] vm_util.copy_virtual_disk( [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] session._wait_for_task(vmdk_copy_task) [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] return self.wait_for_task(task_ref) [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] return evt.wait() [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] result = hub.switch() [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] return self.greenlet.switch() [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1009.037107] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] self.f(*self.args, **self.kw) [ 1009.037477] env[67977]: ERROR nova.compute.manager [instance: 
faf24c4e-135e-47df-85a6-05024bc9b64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1009.037477] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] raise exceptions.translate_fault(task_info.error) [ 1009.037477] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1009.037477] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Faults: ['InvalidArgument'] [ 1009.037477] env[67977]: ERROR nova.compute.manager [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] [ 1009.037477] env[67977]: DEBUG nova.compute.utils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1009.039150] env[67977]: DEBUG nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Build of instance faf24c4e-135e-47df-85a6-05024bc9b64b was re-scheduled: A specified parameter was not correct: fileType [ 1009.039150] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1009.039588] env[67977]: DEBUG nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1009.039798] env[67977]: DEBUG nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1009.040023] env[67977]: DEBUG nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1009.040258] env[67977]: DEBUG nova.network.neutron [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1009.429265] env[67977]: DEBUG nova.network.neutron [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1009.443589] env[67977]: INFO nova.compute.manager [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Took 0.40 seconds to deallocate network for instance. [ 1009.553905] env[67977]: INFO nova.scheduler.client.report [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Deleted allocations for instance faf24c4e-135e-47df-85a6-05024bc9b64b [ 1009.590118] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8e28db8c-9056-49b3-83cd-bac3c02a2d3a tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Lock "faf24c4e-135e-47df-85a6-05024bc9b64b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 293.477s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1009.590118] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Lock "faf24c4e-135e-47df-85a6-05024bc9b64b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 95.462s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1009.590118] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Acquiring lock "faf24c4e-135e-47df-85a6-05024bc9b64b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1009.590321] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Lock
"faf24c4e-135e-47df-85a6-05024bc9b64b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1009.590321] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Lock "faf24c4e-135e-47df-85a6-05024bc9b64b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1009.592826] env[67977]: INFO nova.compute.manager [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Terminating instance [ 1009.595275] env[67977]: DEBUG nova.compute.manager [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1009.595708] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1009.595827] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c605318c-fefa-4f2b-85c0-76ec087a5b6d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.604907] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7df0d8d6-e724-44ef-95cc-df125913345e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.619157] env[67977]: DEBUG nova.compute.manager [None req-7d1919dc-0cd0-45d5-8cf0-86a3ab8d2924 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] [instance: e7543070-519f-470d-b3dd-964b60ce149f] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1009.643449] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance faf24c4e-135e-47df-85a6-05024bc9b64b could not be found. 
[ 1009.643645] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1009.643825] env[67977]: INFO nova.compute.manager [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1009.644081] env[67977]: DEBUG oslo.service.loopingcall [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1009.644852] env[67977]: DEBUG nova.compute.manager [-] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1009.644965] env[67977]: DEBUG nova.network.neutron [-] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1009.646932] env[67977]: DEBUG nova.compute.manager [None req-7d1919dc-0cd0-45d5-8cf0-86a3ab8d2924 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] [instance: e7543070-519f-470d-b3dd-964b60ce149f] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1009.682522] env[67977]: DEBUG nova.network.neutron [-] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1009.684311] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7d1919dc-0cd0-45d5-8cf0-86a3ab8d2924 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Lock "e7543070-519f-470d-b3dd-964b60ce149f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 233.262s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1009.689785] env[67977]: INFO nova.compute.manager [-] [instance: faf24c4e-135e-47df-85a6-05024bc9b64b] Took 0.04 seconds to deallocate network for instance. [ 1009.700194] env[67977]: DEBUG nova.compute.manager [None req-7d1919dc-0cd0-45d5-8cf0-86a3ab8d2924 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] [instance: de7e2949-00a0-4ce7-9a54-c678d8722464] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1009.726207] env[67977]: DEBUG nova.compute.manager [None req-7d1919dc-0cd0-45d5-8cf0-86a3ab8d2924 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] [instance: de7e2949-00a0-4ce7-9a54-c678d8722464] Instance disappeared before build.
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1009.774903] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7d1919dc-0cd0-45d5-8cf0-86a3ab8d2924 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Lock "de7e2949-00a0-4ce7-9a54-c678d8722464" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 233.298s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1009.785360] env[67977]: DEBUG nova.compute.manager [None req-23cf627c-ebfc-4435-8cda-1fe313b8c36d tempest-ServerTagsTestJSON-879612222 tempest-ServerTagsTestJSON-879612222-project-member] [instance: 4a59ec41-924b-4eb0-a025-4820479d535b] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1009.821507] env[67977]: DEBUG nova.compute.manager [None req-23cf627c-ebfc-4435-8cda-1fe313b8c36d tempest-ServerTagsTestJSON-879612222 tempest-ServerTagsTestJSON-879612222-project-member] [instance: 4a59ec41-924b-4eb0-a025-4820479d535b] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1009.846649] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bb55a2a2-9c3f-4273-855f-da6d5fa1c7cd tempest-FloatingIPsAssociationTestJSON-60131274 tempest-FloatingIPsAssociationTestJSON-60131274-project-member] Lock "faf24c4e-135e-47df-85a6-05024bc9b64b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.257s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1009.857013] env[67977]: DEBUG oslo_concurrency.lockutils [None req-23cf627c-ebfc-4435-8cda-1fe313b8c36d tempest-ServerTagsTestJSON-879612222 tempest-ServerTagsTestJSON-879612222-project-member] Lock "4a59ec41-924b-4eb0-a025-4820479d535b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 232.660s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1009.867979] env[67977]: DEBUG nova.compute.manager [None req-59c6d9d3-7fb4-45f9-9210-876f7af9fe8b tempest-ServersTestFqdnHostnames-1425710566 tempest-ServersTestFqdnHostnames-1425710566-project-member] [instance: 2a2e7c1d-af91-48c8-bbbf-3265d7407bb5] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1009.897667] env[67977]: DEBUG nova.compute.manager [None req-59c6d9d3-7fb4-45f9-9210-876f7af9fe8b tempest-ServersTestFqdnHostnames-1425710566 tempest-ServersTestFqdnHostnames-1425710566-project-member] [instance: 2a2e7c1d-af91-48c8-bbbf-3265d7407bb5] Instance disappeared before build.
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1009.924792] env[67977]: DEBUG oslo_concurrency.lockutils [None req-59c6d9d3-7fb4-45f9-9210-876f7af9fe8b tempest-ServersTestFqdnHostnames-1425710566 tempest-ServersTestFqdnHostnames-1425710566-project-member] Lock "2a2e7c1d-af91-48c8-bbbf-3265d7407bb5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 223.867s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1009.934547] env[67977]: DEBUG nova.compute.manager [None req-c3db2a6d-b2be-47da-8d68-cc9b85fd7cb1 tempest-ServerActionsTestJSON-853799719 tempest-ServerActionsTestJSON-853799719-project-member] [instance: e0c3bec9-6a83-4104-87db-673f90fb1247] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1009.967330] env[67977]: DEBUG nova.compute.manager [None req-c3db2a6d-b2be-47da-8d68-cc9b85fd7cb1 tempest-ServerActionsTestJSON-853799719 tempest-ServerActionsTestJSON-853799719-project-member] [instance: e0c3bec9-6a83-4104-87db-673f90fb1247] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1009.989648] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c3db2a6d-b2be-47da-8d68-cc9b85fd7cb1 tempest-ServerActionsTestJSON-853799719 tempest-ServerActionsTestJSON-853799719-project-member] Lock "e0c3bec9-6a83-4104-87db-673f90fb1247" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 221.527s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1010.005592] env[67977]: DEBUG nova.compute.manager [None req-9bc73d1f-a3df-4044-9749-fad3209d280d tempest-ServerActionsV293TestJSON-1366117977 tempest-ServerActionsV293TestJSON-1366117977-project-member] [instance: 05ea43b1-42c7-464b-89c9-b405f7ba20da] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1010.035873] env[67977]: DEBUG nova.compute.manager [None req-9bc73d1f-a3df-4044-9749-fad3209d280d tempest-ServerActionsV293TestJSON-1366117977 tempest-ServerActionsV293TestJSON-1366117977-project-member] [instance: 05ea43b1-42c7-464b-89c9-b405f7ba20da] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1010.061585] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9bc73d1f-a3df-4044-9749-fad3209d280d tempest-ServerActionsV293TestJSON-1366117977 tempest-ServerActionsV293TestJSON-1366117977-project-member] Lock "05ea43b1-42c7-464b-89c9-b405f7ba20da" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 219.325s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1010.072370] env[67977]: DEBUG nova.compute.manager [None req-7913bbbb-62d1-4049-8a13-2a6ce94d16f8 tempest-ServersV294TestFqdnHostnames-690290236 tempest-ServersV294TestFqdnHostnames-690290236-project-member] [instance: 20642d86-67cd-41ee-ac01-d59fcb5d6243] Starting instance...
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1010.103409] env[67977]: DEBUG nova.compute.manager [None req-7913bbbb-62d1-4049-8a13-2a6ce94d16f8 tempest-ServersV294TestFqdnHostnames-690290236 tempest-ServersV294TestFqdnHostnames-690290236-project-member] [instance: 20642d86-67cd-41ee-ac01-d59fcb5d6243] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1010.125706] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7913bbbb-62d1-4049-8a13-2a6ce94d16f8 tempest-ServersV294TestFqdnHostnames-690290236 tempest-ServersV294TestFqdnHostnames-690290236-project-member] Lock "20642d86-67cd-41ee-ac01-d59fcb5d6243" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 204.376s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1010.141015] env[67977]: DEBUG nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1010.216281] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1010.219162] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1010.219162] env[67977]: INFO nova.compute.claims [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1010.774076] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-337ed9b2-62b8-48d0-a100-19a56d2f4b16 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.784241] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65fe2a99-6c8c-4880-9083-89f2e8663010 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.820385] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ace1e55c-58fd-4a24-810e-fdaeb6513894 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.830795] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47928690-d6dc-4918-a4cf-2ff4cf5bedb7 {{(pid=67977) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.845898] env[67977]: DEBUG nova.compute.provider_tree [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1010.859082] env[67977]: DEBUG nova.scheduler.client.report [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1010.881399] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.665s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1010.882024] env[67977]: DEBUG nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1010.934595] env[67977]: DEBUG nova.compute.utils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1010.936177] env[67977]: DEBUG nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1010.938430] env[67977]: DEBUG nova.network.neutron [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1010.947822] env[67977]: DEBUG nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Start building block device mappings for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1011.034619] env[67977]: DEBUG nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1011.037927] env[67977]: DEBUG nova.policy [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dcf46486660c458f85cd933efbce304c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e6a65d2de6d548c98012fcc2a632f287', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1011.064506] env[67977]: DEBUG nova.virt.hardware [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1011.064742] env[67977]: DEBUG nova.virt.hardware [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1011.064899] env[67977]: DEBUG nova.virt.hardware [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1011.065100] env[67977]: DEBUG nova.virt.hardware [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1011.065253] env[67977]: DEBUG nova.virt.hardware [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1011.065401] env[67977]: DEBUG 
nova.virt.hardware [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1011.065613] env[67977]: DEBUG nova.virt.hardware [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1011.065788] env[67977]: DEBUG nova.virt.hardware [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1011.065978] env[67977]: DEBUG nova.virt.hardware [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1011.066161] env[67977]: DEBUG nova.virt.hardware [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1011.066334] env[67977]: DEBUG nova.virt.hardware [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1011.067266] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-229ebec7-9afe-4bcd-a728-71ecdac48e25 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.080163] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2994ffb-8b05-48c5-ba04-2e89fb46eb24 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.444974] env[67977]: DEBUG nova.network.neutron [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Successfully created port: 4056a92b-a292-448c-ac84-1ae7812b8d13 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1012.712422] env[67977]: DEBUG nova.network.neutron [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Successfully updated port: 4056a92b-a292-448c-ac84-1ae7812b8d13 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1012.725508] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 
tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquiring lock "refresh_cache-6e2f1b5e-7bdc-463d-9822-810f99b81623" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1012.725652] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquired lock "refresh_cache-6e2f1b5e-7bdc-463d-9822-810f99b81623" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1012.725852] env[67977]: DEBUG nova.network.neutron [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1012.825396] env[67977]: DEBUG nova.network.neutron [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1013.079948] env[67977]: DEBUG nova.network.neutron [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Updating instance_info_cache with network_info: [{"id": "4056a92b-a292-448c-ac84-1ae7812b8d13", "address": "fa:16:3e:64:9e:f7", "network": {"id": "f737296a-6c8c-4012-9278-5d42029a9b69", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-614337774-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e6a65d2de6d548c98012fcc2a632f287", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ad8894f-e240-4013-8272-4e79daea0751", "external-id": "nsx-vlan-transportzone-204", "segmentation_id": 204, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4056a92b-a2", "ovs_interfaceid": "4056a92b-a292-448c-ac84-1ae7812b8d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1013.092475] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Releasing lock "refresh_cache-6e2f1b5e-7bdc-463d-9822-810f99b81623" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1013.092791] env[67977]: DEBUG nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 
tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Instance network_info: |[{"id": "4056a92b-a292-448c-ac84-1ae7812b8d13", "address": "fa:16:3e:64:9e:f7", "network": {"id": "f737296a-6c8c-4012-9278-5d42029a9b69", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-614337774-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e6a65d2de6d548c98012fcc2a632f287", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ad8894f-e240-4013-8272-4e79daea0751", "external-id": "nsx-vlan-transportzone-204", "segmentation_id": 204, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4056a92b-a2", "ovs_interfaceid": "4056a92b-a292-448c-ac84-1ae7812b8d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1013.093208] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:64:9e:f7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5ad8894f-e240-4013-8272-4e79daea0751', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4056a92b-a292-448c-ac84-1ae7812b8d13', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1013.101020] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Creating folder: Project (e6a65d2de6d548c98012fcc2a632f287). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1013.101598] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-689d2a69-f905-48d3-a631-5a013563a88c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1013.113730] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Created folder: Project (e6a65d2de6d548c98012fcc2a632f287) in parent group-v693022. [ 1013.113915] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Creating folder: Instances. Parent ref: group-v693074. 
{{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1013.114164] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d7cca214-a996-454a-b03c-3b2295e7f531 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1013.125036] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Created folder: Instances in parent group-v693074. [ 1013.125036] env[67977]: DEBUG oslo.service.loopingcall [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1013.125036] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1013.125036] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e86a3adf-dadc-4015-a7c2-b62c2cb61591 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1013.143438] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1013.143438] env[67977]: value = "task-3468176" [ 1013.143438] env[67977]: _type = "Task" [ 1013.143438] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1013.151409] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468176, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1013.245118] env[67977]: DEBUG nova.compute.manager [req-4592ae8c-e660-4a76-8e33-4eb3ce75b8ce req-f92f9bc2-57da-42e0-8a57-b491c469d6b8 service nova] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Received event network-vif-plugged-4056a92b-a292-448c-ac84-1ae7812b8d13 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1013.245118] env[67977]: DEBUG oslo_concurrency.lockutils [req-4592ae8c-e660-4a76-8e33-4eb3ce75b8ce req-f92f9bc2-57da-42e0-8a57-b491c469d6b8 service nova] Acquiring lock "6e2f1b5e-7bdc-463d-9822-810f99b81623-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1013.245118] env[67977]: DEBUG oslo_concurrency.lockutils [req-4592ae8c-e660-4a76-8e33-4eb3ce75b8ce req-f92f9bc2-57da-42e0-8a57-b491c469d6b8 service nova] Lock "6e2f1b5e-7bdc-463d-9822-810f99b81623-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1013.245118] env[67977]: DEBUG oslo_concurrency.lockutils [req-4592ae8c-e660-4a76-8e33-4eb3ce75b8ce req-f92f9bc2-57da-42e0-8a57-b491c469d6b8 service nova] Lock "6e2f1b5e-7bdc-463d-9822-810f99b81623-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1013.245363] env[67977]: DEBUG nova.compute.manager [req-4592ae8c-e660-4a76-8e33-4eb3ce75b8ce req-f92f9bc2-57da-42e0-8a57-b491c469d6b8 service nova] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] No waiting events found dispatching network-vif-plugged-4056a92b-a292-448c-ac84-1ae7812b8d13 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1013.245479] env[67977]: WARNING nova.compute.manager [req-4592ae8c-e660-4a76-8e33-4eb3ce75b8ce req-f92f9bc2-57da-42e0-8a57-b491c469d6b8 service nova] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Received unexpected event network-vif-plugged-4056a92b-a292-448c-ac84-1ae7812b8d13 for instance with vm_state building and task_state spawning. [ 1013.653899] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468176, 'name': CreateVM_Task, 'duration_secs': 0.33914} completed successfully.
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1013.653899] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1013.655025] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1013.655243] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1013.656039] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1013.656039] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bedc39fb-93d1-4cee-bfa7-c83607589a52 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1013.660459] env[67977]: DEBUG oslo_vmware.api [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Waiting for the task: (returnval){ [ 1013.660459] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]522d1ba8-1b9b-01e3-d157-37547cd5bc56" [ 1013.660459] env[67977]: _type = "Task" [ 1013.660459] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1013.668360] env[67977]: DEBUG oslo_vmware.api [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]522d1ba8-1b9b-01e3-d157-37547cd5bc56, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1014.171656] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1014.171656] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1014.171656] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1014.580922] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquiring lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1014.581174] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.271594] env[67977]: DEBUG nova.compute.manager [req-12bfe8b5-b47f-4f64-86ca-3d24caeb4df6 req-5c8df4d7-d82c-4ce7-be29-cb8ac3458c74 service nova] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Received event network-changed-4056a92b-a292-448c-ac84-1ae7812b8d13 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1015.271594] env[67977]: DEBUG nova.compute.manager [req-12bfe8b5-b47f-4f64-86ca-3d24caeb4df6 req-5c8df4d7-d82c-4ce7-be29-cb8ac3458c74 service nova] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Refreshing instance network info cache due to event network-changed-4056a92b-a292-448c-ac84-1ae7812b8d13.
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1015.271972] env[67977]: DEBUG oslo_concurrency.lockutils [req-12bfe8b5-b47f-4f64-86ca-3d24caeb4df6 req-5c8df4d7-d82c-4ce7-be29-cb8ac3458c74 service nova] Acquiring lock "refresh_cache-6e2f1b5e-7bdc-463d-9822-810f99b81623" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1015.271972] env[67977]: DEBUG oslo_concurrency.lockutils [req-12bfe8b5-b47f-4f64-86ca-3d24caeb4df6 req-5c8df4d7-d82c-4ce7-be29-cb8ac3458c74 service nova] Acquired lock "refresh_cache-6e2f1b5e-7bdc-463d-9822-810f99b81623" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1015.272266] env[67977]: DEBUG nova.network.neutron [req-12bfe8b5-b47f-4f64-86ca-3d24caeb4df6 req-5c8df4d7-d82c-4ce7-be29-cb8ac3458c74 service nova] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Refreshing network info cache for port 4056a92b-a292-448c-ac84-1ae7812b8d13 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1016.142065] env[67977]: DEBUG nova.network.neutron [req-12bfe8b5-b47f-4f64-86ca-3d24caeb4df6 req-5c8df4d7-d82c-4ce7-be29-cb8ac3458c74 service nova] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Updated VIF entry in instance network info cache for port 4056a92b-a292-448c-ac84-1ae7812b8d13. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1016.142435] env[67977]: DEBUG nova.network.neutron [req-12bfe8b5-b47f-4f64-86ca-3d24caeb4df6 req-5c8df4d7-d82c-4ce7-be29-cb8ac3458c74 service nova] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Updating instance_info_cache with network_info: [{"id": "4056a92b-a292-448c-ac84-1ae7812b8d13", "address": "fa:16:3e:64:9e:f7", "network": {"id": "f737296a-6c8c-4012-9278-5d42029a9b69", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-614337774-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e6a65d2de6d548c98012fcc2a632f287", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ad8894f-e240-4013-8272-4e79daea0751", "external-id": "nsx-vlan-transportzone-204", "segmentation_id": 204, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4056a92b-a2", "ovs_interfaceid": "4056a92b-a292-448c-ac84-1ae7812b8d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1016.156082] env[67977]: DEBUG oslo_concurrency.lockutils [req-12bfe8b5-b47f-4f64-86ca-3d24caeb4df6 req-5c8df4d7-d82c-4ce7-be29-cb8ac3458c74 service nova] Releasing lock "refresh_cache-6e2f1b5e-7bdc-463d-9822-810f99b81623" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1018.329498] env[67977]: DEBUG oslo_concurrency.lockutils [None req-db7a3000-5043-4cc2-94f5-2886aab67c87 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock 
"c82c83e0-ae02-4e15-8f81-23e4ae8ecc5c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1018.329764] env[67977]: DEBUG oslo_concurrency.lockutils [None req-db7a3000-5043-4cc2-94f5-2886aab67c87 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "c82c83e0-ae02-4e15-8f81-23e4ae8ecc5c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1018.513506] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquiring lock "6e2f1b5e-7bdc-463d-9822-810f99b81623" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1025.803067] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1027.775866] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1027.775866] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1028.770595] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1028.775176] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1028.775176] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1028.775429] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1028.775518] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 
1028.775672] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1029.776554] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1029.787938] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1029.788484] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1029.788787] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1029.789071] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1029.790968] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0234d86-e681-4dfc-b9ae-4051092d4ca5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1029.800368] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47cb2f80-540f-4c9f-aadc-75bb5feeaab8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1029.815080] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6686d75b-c47a-4ce5-8de6-e8650e10b2c4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1029.821564] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fab98163-6034-436e-9ce1-04eac61607af {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1029.851090] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180918MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1029.851263] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1029.851472] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1029.925803] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 02dea9f7-00be-4305-909c-ab9245b60e1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1029.925803] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a2fd776e-9a01-4b67-bc23-1605d6e2b23e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1029.925912] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1029.925991] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1029.926138] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b22ae1a7-c9b8-464b-a81c-73144a0176be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1029.926267] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 83b04c8c-39f6-4f58-b965-0a94c063b68b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1029.926428] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1029.926527] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1029.926621] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d7719b11-cef7-4878-a693-24dcd085a1d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1029.926781] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1029.938251] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1029.949410] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b8faf6c6-2531-44b9-8382-ddbc0feddf24 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1029.959464] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1029.970813] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2a175b1b-e44c-4fd0-801d-445ba66a993c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1029.981996] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 4aa9df28-0b4e-4aef-a647-cd1bd3b15c66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1029.992025] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 597679a6-42e4-4f77-af28-2eaca094f728 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.002010] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 04d4d42b-1ff6-4159-8a34-1b3549d127c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.011919] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 4cfb8096-93f1-4400-bb3a-5a2af940532e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.021768] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a294e292-bc5a-4e79-8224-1cb8c201e81d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.031820] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 32d860b3-f438-400f-8296-e62cc662d618 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.042616] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance cfc3a5c1-ec4a-41c6-911f-96bc0586d17e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.052476] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2977d1a5-655c-4dda-bd9e-3664770f3b62 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.081803] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d0df4be4-5fb6-4f3a-b2fb-695917fe2b7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.093029] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 50ffd9c5-6232-4c6a-ae8c-5492cdf07e32 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.102625] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance fd7c6688-4e12-4186-9235-b2ea93592dae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.112882] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 16c81008-fc75-4722-99c2-bfcdb3121d72 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.123929] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e8092e46-3f2e-4b1a-ad47-e8a5c16db13c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.137286] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3a02e857-fdf9-47fb-a464-7e3683c1ac93 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.148628] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance c2cc96a9-e755-4c6c-b34a-eb28c9c38066 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.157807] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d69d2ea1-575c-4616-a3b4-5b8f381d1fa7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.167677] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1eb71176-689e-4a9d-8852-b289c3d1abbd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.177576] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.187978] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance c82c83e0-ae02-4e15-8f81-23e4ae8ecc5c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
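After walking the per-instance allocations, the tracker reports the final resource view and the (unchanged) inventory for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 in the entries that follow. Placement's usable capacity per resource class is (total - reserved) * allocation_ratio; a worked check against the logged inventory, assuming that documented formula:

    # Inventory as logged for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def usable(inv):
        # Placement treats (total - reserved) * allocation_ratio as the
        # schedulable capacity for each resource class.
        return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
                for rc, v in inv.items()}

    print(usable(inventory))
    # {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}
    # With "total allocated vcpus: 10" reported below, the node is well
    # under its 4x-overcommitted CPU capacity.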
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1030.188233] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1030.188409] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1030.541203] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0974e82-35bb-4cc2-b915-1732903a3572 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.549108] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea9520bd-b14d-4699-a6ea-d35d13d28f54 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.580440] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0688f720-019f-4b81-ae05-c3bed5c8f6ca {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.587765] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b43e0fb-555f-4553-a5f2-d3592938ac51 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.601115] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1030.610187] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1030.624951] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1030.625206] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.774s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1032.625456] env[67977]: DEBUG oslo_service.periodic_task [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1032.625690] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1032.625778] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1032.646436] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.646629] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.646768] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.646899] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.647036] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.647183] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.647287] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.647410] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.647529] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.647647] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.647767] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1057.362228] env[67977]: WARNING oslo_vmware.rw_handles [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1057.362228] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1057.362228] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1057.362228] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1057.362228] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1057.362228] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1057.362228] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1057.362228] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1057.362228] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1057.362228] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1057.362228] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1057.362228] env[67977]: ERROR oslo_vmware.rw_handles [ 1057.362728] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/b8c4ffef-b6c2-478e-9f47-66931ce17a41/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1057.364564] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1057.364799] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Copying Virtual Disk [datastore1] vmware_temp/b8c4ffef-b6c2-478e-9f47-66931ce17a41/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/b8c4ffef-b6c2-478e-9f47-66931ce17a41/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk 
{{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1057.365107] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2e059a0f-00e3-4ac0-bfa3-31b8eb60351d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.374258] env[67977]: DEBUG oslo_vmware.api [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Waiting for the task: (returnval){ [ 1057.374258] env[67977]: value = "task-3468177" [ 1057.374258] env[67977]: _type = "Task" [ 1057.374258] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1057.382341] env[67977]: DEBUG oslo_vmware.api [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Task: {'id': task-3468177, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1057.886110] env[67977]: DEBUG oslo_vmware.exceptions [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1057.886415] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1057.886972] env[67977]: ERROR nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1057.886972] env[67977]: Faults: ['InvalidArgument'] [ 1057.886972] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Traceback (most recent call last): [ 1057.886972] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1057.886972] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] yield resources [ 1057.886972] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1057.886972] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] self.driver.spawn(context, instance, image_meta, [ 1057.886972] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1057.886972] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1057.886972] 
env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1057.886972] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] self._fetch_image_if_missing(context, vi) [ 1057.886972] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] image_cache(vi, tmp_image_ds_loc) [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] vm_util.copy_virtual_disk( [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] session._wait_for_task(vmdk_copy_task) [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] return self.wait_for_task(task_ref) [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] return evt.wait() [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] result = hub.switch() [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1057.887363] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] return self.greenlet.switch() [ 1057.887695] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1057.887695] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] self.f(*self.args, **self.kw) [ 1057.887695] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1057.887695] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] raise exceptions.translate_fault(task_info.error) [ 1057.887695] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1057.887695] env[67977]: ERROR nova.compute.manager 
[instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Faults: ['InvalidArgument'] [ 1057.887695] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] [ 1057.887695] env[67977]: INFO nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Terminating instance [ 1057.888963] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1057.889201] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1057.889456] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6d10869d-4913-437f-9380-008d0e44a17b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.891883] env[67977]: DEBUG nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Start destroying the instance on the hypervisor. 
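The spawn above died in CopyVirtualDisk_Task: vCenter rejected the copy of the just-downloaded tmp-sparse.vmdk with fault InvalidArgument on fileType (note the earlier RemoteDisconnected warning while the write handle for that same image file was being closed, which suggests the upload may not have completed cleanly). oslo.vmware surfaces this as VimFaultException, whose fault_list the "Faults: ['InvalidArgument']" lines reflect. A minimal sketch of distinguishing such faults when waiting on a task, assuming a session object exposing wait_for_task() like the VMwareAPISession in this log:

    from oslo_vmware import exceptions as vexc

    def run_task(session, task_ref):
        try:
            return session.wait_for_task(task_ref)
        except vexc.VimFaultException as e:
            # fault_list carries fault names such as 'InvalidArgument',
            # matching the "Faults: ['InvalidArgument']" entries above.
            if 'InvalidArgument' in (e.fault_list or []):
                raise RuntimeError('vCenter rejected a parameter: %s' % e) from e
            raise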
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1057.892105] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1057.892866] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f30a096-c494-4f65-92c5-ed266ef2dfb8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.900655] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1057.900911] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-533329d4-1b81-4ab6-9e4a-cac4aa90c2c1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.903498] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1057.903677] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1057.904671] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-232354b4-5577-48b2-9e52-c163dac54d7e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.909692] env[67977]: DEBUG oslo_vmware.api [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Waiting for the task: (returnval){ [ 1057.909692] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]522d70fd-331f-0173-f468-43cc39d7dc2e" [ 1057.909692] env[67977]: _type = "Task" [ 1057.909692] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1057.925850] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1057.926091] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Creating directory with path [datastore1] vmware_temp/022ff49e-6e87-4ecf-bf51-e75f27408c2a/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1057.926344] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6a499a2e-2c05-474c-99ae-43acd9fde8be {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.948644] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Created directory with path [datastore1] vmware_temp/022ff49e-6e87-4ecf-bf51-e75f27408c2a/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1057.948842] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Fetch image to [datastore1] vmware_temp/022ff49e-6e87-4ecf-bf51-e75f27408c2a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1057.949082] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/022ff49e-6e87-4ecf-bf51-e75f27408c2a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1057.949936] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57d63fb7-ca99-461d-9e07-12768179c5cf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.957601] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd301bf9-551c-4497-ad3e-7020d4572f81 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.967522] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f74f9aa2-0bd6-4a45-a806-fdb1e7d06549 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.002942] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-926b60b4-32bc-45c9-ad4c-a8cacc25c257 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.005669] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1058.005877] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1058.006058] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Deleting the datastore file [datastore1] 02dea9f7-00be-4305-909c-ab9245b60e1d {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1058.006308] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c15d17c5-2077-48f9-aa67-720dc56fc878 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.012071] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-be94235f-93ba-45da-bbff-a3b759b9a9ee {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.015125] env[67977]: DEBUG oslo_vmware.api [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Waiting for the task: (returnval){ [ 1058.015125] env[67977]: value = "task-3468179" [ 1058.015125] env[67977]: _type = "Task" [ 1058.015125] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1058.026088] env[67977]: DEBUG oslo_vmware.api [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Task: {'id': task-3468179, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1058.040025] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1058.119423] env[67977]: DEBUG oslo_vmware.rw_handles [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/022ff49e-6e87-4ecf-bf51-e75f27408c2a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1058.184076] env[67977]: DEBUG oslo_vmware.rw_handles [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1058.184076] env[67977]: DEBUG oslo_vmware.rw_handles [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/022ff49e-6e87-4ecf-bf51-e75f27408c2a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1058.526548] env[67977]: DEBUG oslo_vmware.api [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Task: {'id': task-3468179, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077925} completed successfully. 
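The rw_handles entries above stream the image over HTTPS to the datastore's /folder endpoint, passing dcPath and dsName as query parameters and closing the write handle once the image iterator is exhausted. A rough requests-based equivalent of that upload, assuming PUT semantics and a vSphere session cookie for auth; this is an illustration of the URL shape and streaming, not the oslo.vmware implementation:

    import requests

    def upload_to_datastore(url, data_iter, size, session_cookie):
        # url shaped like the log's target, e.g.
        # https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/
        #   vmware_temp/.../tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1
        resp = requests.put(
            url,
            data=data_iter,  # chunks streamed from the image iterator
            headers={'Content-Length': str(size),
                     'Cookie': session_cookie},
            timeout=300,
        )
        resp.raise_for_status()
        return resp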
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1058.527662] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1058.527662] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1058.527662] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1058.527828] env[67977]: INFO nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Took 0.64 seconds to destroy the instance on the hypervisor. [ 1058.530095] env[67977]: DEBUG nova.compute.claims [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1058.530306] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1058.530528] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1059.009990] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e06232b-b16a-444d-90c3-3007ace71944 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.019440] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d95f910f-101e-4ee8-b4ab-fc19d32755ad {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.052326] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2392d8e4-e2ab-4477-81c1-f7f1eb3b1b14 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.060393] env[67977]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d1356b5-dd45-4f14-b2e5-f2f566080b39 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.074014] env[67977]: DEBUG nova.compute.provider_tree [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1059.084186] env[67977]: DEBUG nova.scheduler.client.report [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1059.102478] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.572s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1059.103067] env[67977]: ERROR nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1059.103067] env[67977]: Faults: ['InvalidArgument'] [ 1059.103067] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Traceback (most recent call last): [ 1059.103067] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1059.103067] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] self.driver.spawn(context, instance, image_meta, [ 1059.103067] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1059.103067] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1059.103067] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1059.103067] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] self._fetch_image_if_missing(context, vi) [ 1059.103067] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1059.103067] 
env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] image_cache(vi, tmp_image_ds_loc) [ 1059.103067] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] vm_util.copy_virtual_disk( [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] session._wait_for_task(vmdk_copy_task) [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] return self.wait_for_task(task_ref) [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] return evt.wait() [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] result = hub.switch() [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] return self.greenlet.switch() [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1059.103602] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] self.f(*self.args, **self.kw) [ 1059.104157] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1059.104157] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] raise exceptions.translate_fault(task_info.error) [ 1059.104157] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1059.104157] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Faults: ['InvalidArgument'] [ 1059.104157] env[67977]: ERROR nova.compute.manager [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] [ 1059.104157] env[67977]: DEBUG nova.compute.utils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] VimFaultException {{(pid=67977) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 1059.106818] env[67977]: DEBUG nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Build of instance 02dea9f7-00be-4305-909c-ab9245b60e1d was re-scheduled: A specified parameter was not correct: fileType [ 1059.106818] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1059.107317] env[67977]: DEBUG nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1059.107421] env[67977]: DEBUG nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1059.107589] env[67977]: DEBUG nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1059.107752] env[67977]: DEBUG nova.network.neutron [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1059.482630] env[67977]: DEBUG nova.network.neutron [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1059.491967] env[67977]: INFO nova.compute.manager [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Took 0.38 seconds to deallocate network for instance. 
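[editor's note] The traceback above shows how a vCenter task failure surfaces in the compute manager: `wait_for_task` polls the `CopyVirtualDisk` task until it leaves the running state, and an error state is translated into a `VimFaultException` (here carrying `['InvalidArgument']` for the bad `fileType` parameter) that propagates out of `spawn` and triggers the re-schedule logged above. A minimal, self-contained sketch of that polling pattern follows; the dict-based task objects and the `wait_for_task` helper here are illustrative stand-ins, not the actual oslo.vmware implementation (which polls via an oslo.service looping call in `oslo_vmware/api.py`).

```python
# Hypothetical stand-ins for vSphere task objects; the real code lives in
# oslo_vmware/api.py and yields via a green-thread looping call, not sleep().
import time


class VimFaultException(Exception):
    """Carries the fault list reported by vCenter, e.g. ['InvalidArgument']."""

    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


def wait_for_task(get_task_info, interval=0.5):
    """Poll a task until it succeeds or errors, mirroring the log's
    'progress is 0%' ... 'completed successfully' / fault sequence."""
    while True:
        info = get_task_info()
        if info["state"] == "success":
            return info.get("result")
        if info["state"] == "error":
            # Analogous to: raise exceptions.translate_fault(task_info.error)
            raise VimFaultException(info["faults"], info["message"])
        time.sleep(interval)


# Simulate the CopyVirtualDisk_Task failure seen in the traceback above.
states = iter([
    {"state": "running"},
    {"state": "error", "faults": ["InvalidArgument"],
     "message": "A specified parameter was not correct: fileType"},
])
try:
    wait_for_task(lambda: next(states))
except VimFaultException as exc:
    print(exc, exc.fault_list)
```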
[ 1059.592667] env[67977]: INFO nova.scheduler.client.report [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Deleted allocations for instance 02dea9f7-00be-4305-909c-ab9245b60e1d [ 1059.619083] env[67977]: DEBUG oslo_concurrency.lockutils [None req-599db2d3-ffdb-46e2-91b0-9f72fa8f24a3 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Lock "02dea9f7-00be-4305-909c-ab9245b60e1d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 343.296s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1059.621152] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Lock "02dea9f7-00be-4305-909c-ab9245b60e1d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 146.228s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1059.621152] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Acquiring lock "02dea9f7-00be-4305-909c-ab9245b60e1d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1059.621152] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Lock "02dea9f7-00be-4305-909c-ab9245b60e1d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1059.621366] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Lock "02dea9f7-00be-4305-909c-ab9245b60e1d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1059.622984] env[67977]: INFO nova.compute.manager [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Terminating instance [ 1059.627673] env[67977]: DEBUG nova.compute.manager [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1059.627883] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1059.628373] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-997e97c4-85eb-41cd-b73b-a2ac0e72ada4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.637552] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdf8cc2b-39b3-4b45-9a1b-865e5314e897 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.648379] env[67977]: DEBUG nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1059.669233] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 02dea9f7-00be-4305-909c-ab9245b60e1d could not be found. [ 1059.669447] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1059.669622] env[67977]: INFO nova.compute.manager [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1059.669869] env[67977]: DEBUG oslo.service.loopingcall [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1059.670156] env[67977]: DEBUG nova.compute.manager [-] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1059.670266] env[67977]: DEBUG nova.network.neutron [-] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1059.698030] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1059.698030] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1059.699390] env[67977]: INFO nova.compute.claims [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1059.702097] env[67977]: DEBUG nova.network.neutron [-] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1059.713054] env[67977]: INFO nova.compute.manager [-] [instance: 02dea9f7-00be-4305-909c-ab9245b60e1d] Took 0.04 seconds to deallocate network for instance. 
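[editor's note] The terminate sequence above also shows the driver tolerating an instance that has already vanished from the backend: `InstanceNotFound` is logged as a WARNING, the destroy is still reported as successful ("Instance destroyed", "Took 0.04 seconds"), and cleanup (network deallocation, lock release) proceeds. A small illustrative sketch of that idempotent-destroy pattern, using hypothetical names rather than the Nova source:

```python
# Illustrative only: a toy backend standing in for the vmwareapi vmops layer.
import logging

logging.basicConfig(level=logging.INFO)
LOG = logging.getLogger(__name__)


class InstanceNotFound(Exception):
    pass


def destroy_instance(backend, instance_id):
    """Destroy an instance, treating 'already gone' as success so that
    terminate can continue with network and allocation cleanup."""
    try:
        backend.unregister(instance_id)
        backend.delete_files(instance_id)
    except InstanceNotFound:
        # Matches the WARNING above: the VM no longer exists on the
        # hypervisor, so there is nothing left to tear down there.
        LOG.warning("Instance does not exist on backend: %s", instance_id)
    LOG.info("Instance destroyed")


class GoneBackend:
    def unregister(self, instance_id):
        raise InstanceNotFound(instance_id)

    def delete_files(self, instance_id):
        raise InstanceNotFound(instance_id)


destroy_instance(GoneBackend(), "02dea9f7-00be-4305-909c-ab9245b60e1d")
```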
[ 1059.822586] env[67977]: DEBUG oslo_concurrency.lockutils [None req-7422366a-f83e-4c85-86fc-14a0587b26e0 tempest-ImagesOneServerTestJSON-30322406 tempest-ImagesOneServerTestJSON-30322406-project-member] Lock "02dea9f7-00be-4305-909c-ab9245b60e1d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.202s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.156286] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b31c870-43e6-4150-a714-831d815102b5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.164043] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af445de0-d490-4c8f-bbc4-cde02dd2b0eb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.194317] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98947bb3-5430-448d-acd7-e374c4805cef {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.201836] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98a723ad-cdc4-47f5-b5a3-a496c7d4b52e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.218296] env[67977]: DEBUG nova.compute.provider_tree [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1060.226635] env[67977]: DEBUG nova.scheduler.client.report [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1060.250242] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.552s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.250757] env[67977]: DEBUG nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Start building networks asynchronously for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1060.287044] env[67977]: DEBUG nova.compute.utils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1060.288271] env[67977]: DEBUG nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1060.288440] env[67977]: DEBUG nova.network.neutron [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1060.299328] env[67977]: DEBUG nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1060.376020] env[67977]: DEBUG nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1060.401299] env[67977]: DEBUG nova.virt.hardware [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1060.401549] env[67977]: DEBUG nova.virt.hardware [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1060.401782] env[67977]: DEBUG nova.virt.hardware [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1060.401994] env[67977]: DEBUG nova.virt.hardware [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1060.402151] env[67977]: DEBUG nova.virt.hardware [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1060.402300] env[67977]: DEBUG nova.virt.hardware [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1060.402507] env[67977]: DEBUG nova.virt.hardware [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1060.402667] env[67977]: DEBUG nova.virt.hardware [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1060.402834] env[67977]: DEBUG nova.virt.hardware [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1060.403035] env[67977]: DEBUG nova.virt.hardware [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1060.403230] env[67977]: DEBUG nova.virt.hardware [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1060.404306] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fbdf63f-acce-46f3-81dc-c6dfe479b358 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.412836] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb3e1e81-5d54-4250-b7fd-df687af24e82 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.417990] env[67977]: DEBUG nova.policy [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '731126c839df46a2ba21487512491f5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '48b5b81492f04f0eaffa947e6bde7284', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1061.087824] env[67977]: DEBUG nova.network.neutron [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Successfully created port: 9f054cc5-b8bb-4056-b2ec-d92e45be8aac {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1062.351541] env[67977]: DEBUG nova.network.neutron [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Successfully updated port: 9f054cc5-b8bb-4056-b2ec-d92e45be8aac {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1062.366052] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquiring lock 
"refresh_cache-eae30b17-eea5-46aa-bb09-91ebca29ea6d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1062.366162] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquired lock "refresh_cache-eae30b17-eea5-46aa-bb09-91ebca29ea6d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1062.366301] env[67977]: DEBUG nova.network.neutron [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1062.422613] env[67977]: DEBUG nova.network.neutron [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1062.437199] env[67977]: DEBUG nova.compute.manager [req-3817c25e-cd4e-4834-9fef-ef0577661747 req-fa5c4c38-590e-4d99-8614-749ee3a87e74 service nova] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Received event network-vif-plugged-9f054cc5-b8bb-4056-b2ec-d92e45be8aac {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1062.437199] env[67977]: DEBUG oslo_concurrency.lockutils [req-3817c25e-cd4e-4834-9fef-ef0577661747 req-fa5c4c38-590e-4d99-8614-749ee3a87e74 service nova] Acquiring lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1062.437330] env[67977]: DEBUG oslo_concurrency.lockutils [req-3817c25e-cd4e-4834-9fef-ef0577661747 req-fa5c4c38-590e-4d99-8614-749ee3a87e74 service nova] Lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1062.437418] env[67977]: DEBUG oslo_concurrency.lockutils [req-3817c25e-cd4e-4834-9fef-ef0577661747 req-fa5c4c38-590e-4d99-8614-749ee3a87e74 service nova] Lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1062.437588] env[67977]: DEBUG nova.compute.manager [req-3817c25e-cd4e-4834-9fef-ef0577661747 req-fa5c4c38-590e-4d99-8614-749ee3a87e74 service nova] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] No waiting events found dispatching network-vif-plugged-9f054cc5-b8bb-4056-b2ec-d92e45be8aac {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1062.437751] env[67977]: WARNING nova.compute.manager [req-3817c25e-cd4e-4834-9fef-ef0577661747 req-fa5c4c38-590e-4d99-8614-749ee3a87e74 service nova] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Received unexpected event 
network-vif-plugged-9f054cc5-b8bb-4056-b2ec-d92e45be8aac for instance with vm_state building and task_state spawning. [ 1062.654295] env[67977]: DEBUG nova.network.neutron [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Updating instance_info_cache with network_info: [{"id": "9f054cc5-b8bb-4056-b2ec-d92e45be8aac", "address": "fa:16:3e:df:dc:0d", "network": {"id": "5fae390a-0dc1-407c-b892-a94f13fd55cf", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1721834356-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "48b5b81492f04f0eaffa947e6bde7284", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b80dd748-3d7e-4a23-a38d-9e79a3881452", "external-id": "nsx-vlan-transportzone-497", "segmentation_id": 497, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9f054cc5-b8", "ovs_interfaceid": "9f054cc5-b8bb-4056-b2ec-d92e45be8aac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1062.669820] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Releasing lock "refresh_cache-eae30b17-eea5-46aa-bb09-91ebca29ea6d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1062.670134] env[67977]: DEBUG nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Instance network_info: |[{"id": "9f054cc5-b8bb-4056-b2ec-d92e45be8aac", "address": "fa:16:3e:df:dc:0d", "network": {"id": "5fae390a-0dc1-407c-b892-a94f13fd55cf", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1721834356-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "48b5b81492f04f0eaffa947e6bde7284", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b80dd748-3d7e-4a23-a38d-9e79a3881452", "external-id": "nsx-vlan-transportzone-497", "segmentation_id": 497, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9f054cc5-b8", "ovs_interfaceid": "9f054cc5-b8bb-4056-b2ec-d92e45be8aac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1062.670537] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:df:dc:0d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b80dd748-3d7e-4a23-a38d-9e79a3881452', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9f054cc5-b8bb-4056-b2ec-d92e45be8aac', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1062.680046] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Creating folder: Project (48b5b81492f04f0eaffa947e6bde7284). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1062.680238] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-87bf93d4-482b-484d-87bc-95d1950edacc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.694398] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Created folder: Project (48b5b81492f04f0eaffa947e6bde7284) in parent group-v693022. [ 1062.694398] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Creating folder: Instances. Parent ref: group-v693077. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1062.694398] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f1d1ee66-f3fd-444a-a874-ad96160eef16 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.704422] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Created folder: Instances in parent group-v693077. [ 1062.704422] env[67977]: DEBUG oslo.service.loopingcall [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1062.704422] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1062.704422] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-88a57eae-26cf-4d2d-abdb-105868b17c22 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.737208] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1062.737208] env[67977]: value = "task-3468182" [ 1062.737208] env[67977]: _type = "Task" [ 1062.737208] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1062.745726] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468182, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1063.249259] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468182, 'name': CreateVM_Task, 'duration_secs': 0.342767} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1063.249466] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1063.250438] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1063.250661] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1063.251114] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1063.251650] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2e41fa6e-ac2a-4049-bf9c-2aeaf7bf6dee {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.256905] env[67977]: DEBUG oslo_vmware.api [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Waiting for the task: (returnval){ [ 1063.256905] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5233363f-3104-7197-5cef-6ba177ae538b" [ 1063.256905] env[67977]: _type = 
"Task" [ 1063.256905] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1063.265238] env[67977]: DEBUG oslo_vmware.api [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5233363f-3104-7197-5cef-6ba177ae538b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1063.767880] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1063.768236] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1063.768377] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1064.493138] env[67977]: DEBUG nova.compute.manager [req-9bb10a3c-9aa7-4579-a7d4-fcb24f3fd569 req-5707bb88-bdd0-4ebb-abb9-805fa134ed8a service nova] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Received event network-changed-9f054cc5-b8bb-4056-b2ec-d92e45be8aac {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1064.493346] env[67977]: DEBUG nova.compute.manager [req-9bb10a3c-9aa7-4579-a7d4-fcb24f3fd569 req-5707bb88-bdd0-4ebb-abb9-805fa134ed8a service nova] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Refreshing instance network info cache due to event network-changed-9f054cc5-b8bb-4056-b2ec-d92e45be8aac. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1064.493649] env[67977]: DEBUG oslo_concurrency.lockutils [req-9bb10a3c-9aa7-4579-a7d4-fcb24f3fd569 req-5707bb88-bdd0-4ebb-abb9-805fa134ed8a service nova] Acquiring lock "refresh_cache-eae30b17-eea5-46aa-bb09-91ebca29ea6d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1064.493822] env[67977]: DEBUG oslo_concurrency.lockutils [req-9bb10a3c-9aa7-4579-a7d4-fcb24f3fd569 req-5707bb88-bdd0-4ebb-abb9-805fa134ed8a service nova] Acquired lock "refresh_cache-eae30b17-eea5-46aa-bb09-91ebca29ea6d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1064.494035] env[67977]: DEBUG nova.network.neutron [req-9bb10a3c-9aa7-4579-a7d4-fcb24f3fd569 req-5707bb88-bdd0-4ebb-abb9-805fa134ed8a service nova] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Refreshing network info cache for port 9f054cc5-b8bb-4056-b2ec-d92e45be8aac {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1064.845459] env[67977]: DEBUG nova.network.neutron [req-9bb10a3c-9aa7-4579-a7d4-fcb24f3fd569 req-5707bb88-bdd0-4ebb-abb9-805fa134ed8a service nova] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Updated VIF entry in instance network info cache for port 9f054cc5-b8bb-4056-b2ec-d92e45be8aac. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1064.845874] env[67977]: DEBUG nova.network.neutron [req-9bb10a3c-9aa7-4579-a7d4-fcb24f3fd569 req-5707bb88-bdd0-4ebb-abb9-805fa134ed8a service nova] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Updating instance_info_cache with network_info: [{"id": "9f054cc5-b8bb-4056-b2ec-d92e45be8aac", "address": "fa:16:3e:df:dc:0d", "network": {"id": "5fae390a-0dc1-407c-b892-a94f13fd55cf", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1721834356-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "48b5b81492f04f0eaffa947e6bde7284", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b80dd748-3d7e-4a23-a38d-9e79a3881452", "external-id": "nsx-vlan-transportzone-497", "segmentation_id": 497, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9f054cc5-b8", "ovs_interfaceid": "9f054cc5-b8bb-4056-b2ec-d92e45be8aac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1064.855655] env[67977]: DEBUG oslo_concurrency.lockutils [req-9bb10a3c-9aa7-4579-a7d4-fcb24f3fd569 req-5707bb88-bdd0-4ebb-abb9-805fa134ed8a service nova] Releasing lock "refresh_cache-eae30b17-eea5-46aa-bb09-91ebca29ea6d" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1065.929664] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 
tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquiring lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1069.917323] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "8f1440c5-e712-4635-9f02-f9cda12da693" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1069.917632] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8f1440c5-e712-4635-9f02-f9cda12da693" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1088.775812] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1088.776091] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1088.776256] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1089.771412] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1089.775066] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1089.775261] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1089.775420] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1090.775965] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1091.774990] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1091.788525] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1091.788525] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1091.788904] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1091.788904] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1091.790286] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03ec54b0-7d06-469e-b678-6701698bf6d7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1091.800035] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dd663a4-8f43-42cc-bbde-1d6bdb3478dc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1091.814694] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05b2f159-eac5-4e45-9520-ddbd17194611 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1091.821332] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04182e2e-c926-49f2-b4c2-c53d54ed7635 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1091.852785] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180946MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1091.852785] env[67977]: DEBUG 
oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1091.852785] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1091.928327] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a2fd776e-9a01-4b67-bc23-1605d6e2b23e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1091.928497] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1091.928755] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1091.928894] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b22ae1a7-c9b8-464b-a81c-73144a0176be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1091.929026] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 83b04c8c-39f6-4f58-b965-0a94c063b68b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1091.929152] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1091.929271] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1091.929385] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d7719b11-cef7-4878-a693-24dcd085a1d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1091.929500] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1091.929637] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1091.941622] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b8faf6c6-2531-44b9-8382-ddbc0feddf24 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1091.953191] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1091.965067] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2a175b1b-e44c-4fd0-801d-445ba66a993c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1091.976950] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 4aa9df28-0b4e-4aef-a647-cd1bd3b15c66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1091.986950] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 597679a6-42e4-4f77-af28-2eaca094f728 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1091.997106] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 04d4d42b-1ff6-4159-8a34-1b3549d127c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.007945] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 4cfb8096-93f1-4400-bb3a-5a2af940532e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.018788] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a294e292-bc5a-4e79-8224-1cb8c201e81d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.029008] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 32d860b3-f438-400f-8296-e62cc662d618 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.038608] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance cfc3a5c1-ec4a-41c6-911f-96bc0586d17e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.048716] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2977d1a5-655c-4dda-bd9e-3664770f3b62 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.058807] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d0df4be4-5fb6-4f3a-b2fb-695917fe2b7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.068292] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 50ffd9c5-6232-4c6a-ae8c-5492cdf07e32 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.077657] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance fd7c6688-4e12-4186-9235-b2ea93592dae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.087884] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 16c81008-fc75-4722-99c2-bfcdb3121d72 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.097156] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e8092e46-3f2e-4b1a-ad47-e8a5c16db13c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.106152] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3a02e857-fdf9-47fb-a464-7e3683c1ac93 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.116098] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance c2cc96a9-e755-4c6c-b34a-eb28c9c38066 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.128469] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d69d2ea1-575c-4616-a3b4-5b8f381d1fa7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.137832] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1eb71176-689e-4a9d-8852-b289c3d1abbd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.148376] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.158037] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance c82c83e0-ae02-4e15-8f81-23e4ae8ecc5c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.167754] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8f1440c5-e712-4635-9f02-f9cda12da693 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1092.168603] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1092.168603] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1092.539478] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-744e47ce-1e18-476e-baac-ac9b54c0b768 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.547032] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56fc9cac-9ec6-47c2-b606-d2db089fa310 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.576479] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6340b19c-fc6e-4597-a3b0-b7165b3403d8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.583799] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0482638-bcfc-4c65-a46d-6e2693704a47 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.596472] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1092.604969] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1092.618158] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1092.619027] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.767s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1094.620020] env[67977]: DEBUG oslo_service.periodic_task [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1094.620020] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1094.620314] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1094.648913] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.649152] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.649301] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.649431] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.649588] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.649723] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.649846] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.649964] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.650091] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.650208] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.650326] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1104.975693] env[67977]: WARNING oslo_vmware.rw_handles [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1104.975693] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1104.975693] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1104.975693] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1104.975693] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1104.975693] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1104.975693] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1104.975693] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1104.975693] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1104.975693] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1104.975693] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1104.975693] env[67977]: ERROR oslo_vmware.rw_handles [ 1104.976363] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/022ff49e-6e87-4ecf-bf51-e75f27408c2a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1104.978256] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1104.978560] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Copying Virtual Disk [datastore1] vmware_temp/022ff49e-6e87-4ecf-bf51-e75f27408c2a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] 
vmware_temp/022ff49e-6e87-4ecf-bf51-e75f27408c2a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1104.978845] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-db0df98b-0b8e-492c-845a-f6167c1dac25 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1104.986253] env[67977]: DEBUG oslo_vmware.api [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Waiting for the task: (returnval){ [ 1104.986253] env[67977]: value = "task-3468183" [ 1104.986253] env[67977]: _type = "Task" [ 1104.986253] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1104.994559] env[67977]: DEBUG oslo_vmware.api [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Task: {'id': task-3468183, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1105.497076] env[67977]: DEBUG oslo_vmware.exceptions [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1105.497397] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1105.498013] env[67977]: ERROR nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1105.498013] env[67977]: Faults: ['InvalidArgument'] [ 1105.498013] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Traceback (most recent call last): [ 1105.498013] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1105.498013] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] yield resources [ 1105.498013] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1105.498013] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] self.driver.spawn(context, instance, image_meta, [ 1105.498013] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1105.498013] env[67977]: ERROR 
nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1105.498013] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1105.498013] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] self._fetch_image_if_missing(context, vi) [ 1105.498013] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] image_cache(vi, tmp_image_ds_loc) [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] vm_util.copy_virtual_disk( [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] session._wait_for_task(vmdk_copy_task) [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] return self.wait_for_task(task_ref) [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] return evt.wait() [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] result = hub.switch() [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1105.498331] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] return self.greenlet.switch() [ 1105.498654] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1105.498654] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] self.f(*self.args, **self.kw) [ 1105.498654] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1105.498654] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] raise exceptions.translate_fault(task_info.error) [ 1105.498654] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1105.498654] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Faults: ['InvalidArgument'] [ 1105.498654] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] [ 1105.498654] env[67977]: INFO nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Terminating instance [ 1105.500024] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1105.500269] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1105.500535] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9a443140-b296-4cc6-b8f0-aebbc29c5d2b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.503086] env[67977]: DEBUG nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1105.503322] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1105.504118] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40c89ec6-e0c7-4fd6-b0ff-b4d98968ec88 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.510850] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1105.511089] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d82ae283-a4f1-429a-bbb3-20ba0503ece6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.513371] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1105.513565] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1105.514577] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e4baedd8-f9e8-40f2-b8c8-2bfab1f9bf5e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.520052] env[67977]: DEBUG oslo_vmware.api [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for the task: (returnval){ [ 1105.520052] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52331b6f-51f2-30bb-9300-3ed4bc54cf57" [ 1105.520052] env[67977]: _type = "Task" [ 1105.520052] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1105.529281] env[67977]: DEBUG oslo_vmware.api [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52331b6f-51f2-30bb-9300-3ed4bc54cf57, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1105.583587] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1105.583802] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1105.583995] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Deleting the datastore file [datastore1] a2fd776e-9a01-4b67-bc23-1605d6e2b23e {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1105.584252] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6acd993b-8c22-4cbf-a627-e60b103b2048 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.590055] env[67977]: DEBUG oslo_vmware.api [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Waiting for the task: (returnval){ [ 1105.590055] env[67977]: value = "task-3468185" [ 1105.590055] env[67977]: _type = "Task" [ 1105.590055] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1105.597722] env[67977]: DEBUG oslo_vmware.api [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Task: {'id': task-3468185, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1106.029741] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1106.030042] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Creating directory with path [datastore1] vmware_temp/b34a2013-2a17-446e-a23f-a1bdc5948437/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1106.030238] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dc7068a2-9dd1-42b6-b8e8-5f6367a3d01e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.041747] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Created directory with path [datastore1] vmware_temp/b34a2013-2a17-446e-a23f-a1bdc5948437/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1106.042413] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Fetch image to [datastore1] vmware_temp/b34a2013-2a17-446e-a23f-a1bdc5948437/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1106.042413] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/b34a2013-2a17-446e-a23f-a1bdc5948437/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1106.043129] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca8fd84f-e7e4-4a25-ada6-8be16efd1526 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.050158] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f274517e-e242-4f6b-8e48-52923f707012 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.059215] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79ecdcc7-79ad-4786-8b56-8fcbd66c48c2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.094130] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f79924aa-ae42-4599-9539-d86c7f61509f {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.101373] env[67977]: DEBUG oslo_vmware.api [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Task: {'id': task-3468185, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082725} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1106.102826] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1106.103914] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1106.103914] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1106.103914] env[67977]: INFO nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1106.105112] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7fcb12b3-5923-46c9-86cc-4c606304dd79 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.107075] env[67977]: DEBUG nova.compute.claims [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1106.107254] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1106.107464] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1106.137831] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1106.212158] env[67977]: DEBUG oslo_vmware.rw_handles [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b34a2013-2a17-446e-a23f-a1bdc5948437/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1106.276949] env[67977]: DEBUG oslo_vmware.rw_handles [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1106.277198] env[67977]: DEBUG oslo_vmware.rw_handles [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b34a2013-2a17-446e-a23f-a1bdc5948437/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1106.621123] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c1703b2-fe62-4fa1-921b-a52ad7ff703c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.629277] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92631b24-f81e-4abc-b3cb-ebba69f4327c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.658846] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c78ff6a1-1fbe-441e-97e5-4faf8410957f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.666243] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2d19ec7-8673-4e63-b9a6-45b9449a9faa {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.680616] env[67977]: DEBUG nova.compute.provider_tree [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1106.690178] env[67977]: DEBUG nova.scheduler.client.report [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1106.707128] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.599s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.707716] env[67977]: ERROR nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1106.707716] env[67977]: Faults: ['InvalidArgument'] [ 1106.707716] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Traceback (most recent call last): [ 1106.707716] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1106.707716] 
env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] self.driver.spawn(context, instance, image_meta, [ 1106.707716] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1106.707716] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1106.707716] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1106.707716] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] self._fetch_image_if_missing(context, vi) [ 1106.707716] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1106.707716] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] image_cache(vi, tmp_image_ds_loc) [ 1106.707716] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] vm_util.copy_virtual_disk( [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] session._wait_for_task(vmdk_copy_task) [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] return self.wait_for_task(task_ref) [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] return evt.wait() [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] result = hub.switch() [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] return self.greenlet.switch() [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1106.708088] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] self.f(*self.args, **self.kw) [ 1106.708494] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1106.708494] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] raise exceptions.translate_fault(task_info.error) [ 1106.708494] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1106.708494] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Faults: ['InvalidArgument'] [ 1106.708494] env[67977]: ERROR nova.compute.manager [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] [ 1106.708494] env[67977]: DEBUG nova.compute.utils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1106.710401] env[67977]: DEBUG nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Build of instance a2fd776e-9a01-4b67-bc23-1605d6e2b23e was re-scheduled: A specified parameter was not correct: fileType [ 1106.710401] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1106.710799] env[67977]: DEBUG nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1106.710973] env[67977]: DEBUG nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1106.711158] env[67977]: DEBUG nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1106.711349] env[67977]: DEBUG nova.network.neutron [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1107.119317] env[67977]: DEBUG nova.network.neutron [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1107.134431] env[67977]: INFO nova.compute.manager [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Took 0.42 seconds to deallocate network for instance. [ 1107.252836] env[67977]: INFO nova.scheduler.client.report [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Deleted allocations for instance a2fd776e-9a01-4b67-bc23-1605d6e2b23e [ 1107.291302] env[67977]: DEBUG oslo_concurrency.lockutils [None req-625524dd-e8fa-4a0b-9ae9-efbe220c8ae1 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 382.291s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.291553] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 183.693s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1107.291927] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Acquiring lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1107.292044] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1107.292147] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.294306] env[67977]: INFO nova.compute.manager [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Terminating instance [ 1107.295987] env[67977]: DEBUG nova.compute.manager [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1107.296199] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1107.296679] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0b0fb074-f24b-458b-b783-d21f8f3e4399 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.306459] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6fcaae0-eaf4-41bf-bcdd-b9b9b06e00b8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.317635] env[67977]: DEBUG nova.compute.manager [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1107.338752] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a2fd776e-9a01-4b67-bc23-1605d6e2b23e could not be found. 
[ 1107.339041] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1107.339298] env[67977]: INFO nova.compute.manager [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1107.339634] env[67977]: DEBUG oslo.service.loopingcall [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1107.339929] env[67977]: DEBUG nova.compute.manager [-] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1107.340070] env[67977]: DEBUG nova.network.neutron [-] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1107.374767] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1107.374767] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1107.377695] env[67977]: INFO nova.compute.claims [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1107.381321] env[67977]: DEBUG nova.network.neutron [-] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1107.390402] env[67977]: INFO nova.compute.manager [-] [instance: a2fd776e-9a01-4b67-bc23-1605d6e2b23e] Took 0.05 seconds to deallocate network for instance. 
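In the terminate sequence above, the backend lookup (SearchIndex.FindAllByUuid) finds no VM for a2fd776e, yet the flow still logs 'Instance destroyed' and goes on to deallocate the network: InstanceNotFound is deliberately tolerated so cleanup always runs. A rough sketch of that tolerance, assuming a hypothetical find_vm_ref helper:

    class InstanceNotFound(Exception):
        pass

    def destroy_on_hypervisor(find_vm_ref, instance_uuid, log):
        # find_vm_ref stands in for the SearchIndex.FindAllByUuid lookup above.
        try:
            vm_ref = find_vm_ref(instance_uuid)
            log("Destroying instance %s (ref %s)" % (instance_uuid, vm_ref))
            # ... unregister and delete the VM here ...
        except InstanceNotFound:
            # Mirrors the WARNING above: a VM that is already gone is not fatal.
            log("Instance does not exist on backend: %s" % instance_uuid)
        # Either way, the caller continues with network deallocation.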
[ 1107.504799] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1f587971-e65f-4df2-b73d-5737e1333567 tempest-InstanceActionsTestJSON-2041163630 tempest-InstanceActionsTestJSON-2041163630-project-member] Lock "a2fd776e-9a01-4b67-bc23-1605d6e2b23e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.213s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.771484] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "b8faf6c6-2531-44b9-8382-ddbc0feddf24" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1107.918068] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23b9426f-cc84-4b71-a4fb-0498df8f2392 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.925557] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5426e83c-0810-4937-9e2a-ba112e3b034a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.958061] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56c740c0-225c-4c02-8339-15b6ec86b4bb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.967056] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1caac1a-60dc-4e3a-9154-20782180bc5c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.981573] env[67977]: DEBUG nova.compute.provider_tree [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1107.991040] env[67977]: DEBUG nova.scheduler.client.report [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1108.008418] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.634s {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1108.008944] env[67977]: DEBUG nova.compute.manager [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1108.048064] env[67977]: DEBUG nova.compute.claims [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1108.048064] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1108.048064] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1108.516294] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bc64dbf-b4d3-4b5b-b739-2bcea2e9bf09 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.527709] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5aa8f9c-0c2c-4742-9b4e-bca8da57c89d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.557493] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8555f2a2-2055-43fb-befb-3078fb0bddbf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.566159] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a04063c-e7fd-4dfa-81e6-3cc0f0c7a59a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.579181] env[67977]: DEBUG nova.compute.provider_tree [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1108.589277] env[67977]: DEBUG nova.scheduler.client.report [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 
1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1108.603273] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.555s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1108.604036] env[67977]: DEBUG nova.compute.utils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Conflict updating instance b8faf6c6-2531-44b9-8382-ddbc0feddf24. Expected: {'task_state': [None]}. Actual: {'task_state': 'deleting'} {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1108.605414] env[67977]: DEBUG nova.compute.manager [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Instance disappeared during build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2487}} [ 1108.605597] env[67977]: DEBUG nova.compute.manager [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1108.605809] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "refresh_cache-b8faf6c6-2531-44b9-8382-ddbc0feddf24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1108.605954] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquired lock "refresh_cache-b8faf6c6-2531-44b9-8382-ddbc0feddf24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1108.606228] env[67977]: DEBUG nova.network.neutron [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1108.638777] env[67977]: DEBUG nova.network.neutron [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1108.725657] env[67977]: DEBUG nova.network.neutron [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1108.736626] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Releasing lock "refresh_cache-b8faf6c6-2531-44b9-8382-ddbc0feddf24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1108.736875] env[67977]: DEBUG nova.compute.manager [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1108.737052] env[67977]: DEBUG nova.compute.manager [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1108.737222] env[67977]: DEBUG nova.network.neutron [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1108.753925] env[67977]: DEBUG nova.network.neutron [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1108.762682] env[67977]: DEBUG nova.network.neutron [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1108.773239] env[67977]: INFO nova.compute.manager [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Took 0.04 seconds to deallocate network for instance. 
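The b8faf6c6 build above lost a race with its own delete: the message "Conflict updating instance ... Expected: {'task_state': [None]}. Actual: {'task_state': 'deleting'}" comes from a compare-and-swap style instance save, after which the resource claim is aborted and the build abandoned ('Instance disappeared during build.'). A toy version of such a guarded update (the dict-backed storage here is hypothetical, not nova's DB layer):

    class UnexpectedTaskStateError(Exception):
        pass

    def save_instance(row, updates, expected_task_state=None):
        # Apply the update only if task_state still matches what the caller
        # last saw; otherwise another actor (here: a concurrent delete that
        # set 'deleting') won the race and the save is rejected.
        if row.get("task_state") != expected_task_state:
            raise UnexpectedTaskStateError(
                "Expected: {'task_state': [%s]}. Actual: {'task_state': %r}"
                % (expected_task_state, row.get("task_state")))
        row.update(updates)
        return row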
[ 1108.858869] env[67977]: INFO nova.scheduler.client.report [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Deleted allocations for instance b8faf6c6-2531-44b9-8382-ddbc0feddf24 [ 1108.859190] env[67977]: DEBUG oslo_concurrency.lockutils [None req-88df518d-ca48-4f2e-bc35-abe07ad7ddf9 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "b8faf6c6-2531-44b9-8382-ddbc0feddf24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 196.939s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1108.860309] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "b8faf6c6-2531-44b9-8382-ddbc0feddf24" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 1.089s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1108.860534] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "b8faf6c6-2531-44b9-8382-ddbc0feddf24-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1108.860750] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "b8faf6c6-2531-44b9-8382-ddbc0feddf24-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1108.860921] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "b8faf6c6-2531-44b9-8382-ddbc0feddf24-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1108.862938] env[67977]: INFO nova.compute.manager [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Terminating instance [ 1108.864486] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquiring lock "refresh_cache-b8faf6c6-2531-44b9-8382-ddbc0feddf24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1108.864741] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Acquired lock 
"refresh_cache-b8faf6c6-2531-44b9-8382-ddbc0feddf24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1108.865064] env[67977]: DEBUG nova.network.neutron [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1108.869939] env[67977]: DEBUG nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1108.896095] env[67977]: DEBUG nova.network.neutron [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1108.938077] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1108.938344] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1108.939897] env[67977]: INFO nova.compute.claims [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1108.985191] env[67977]: DEBUG nova.network.neutron [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1108.998032] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Releasing lock "refresh_cache-b8faf6c6-2531-44b9-8382-ddbc0feddf24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1108.998666] env[67977]: DEBUG nova.compute.manager [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1108.999018] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1109.006307] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-856c41b9-c328-4a1a-ab47-4730cd2831fa {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.011485] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e0d9a77-39e7-42ac-91f3-a3e72d5db886 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.045472] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b8faf6c6-2531-44b9-8382-ddbc0feddf24 could not be found. [ 1109.045472] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1109.045472] env[67977]: INFO nova.compute.manager [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1109.045472] env[67977]: DEBUG oslo.service.loopingcall [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1109.048541] env[67977]: DEBUG nova.compute.manager [-] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1109.048541] env[67977]: DEBUG nova.network.neutron [-] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1109.070466] env[67977]: DEBUG nova.network.neutron [-] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1109.082921] env[67977]: DEBUG nova.network.neutron [-] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1109.097250] env[67977]: INFO nova.compute.manager [-] [instance: b8faf6c6-2531-44b9-8382-ddbc0feddf24] Took 0.05 seconds to deallocate network for instance. [ 1109.200577] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba78e970-2314-465b-b763-9f81181a2ef2 tempest-DeleteServersAdminTestJSON-917492072 tempest-DeleteServersAdminTestJSON-917492072-project-member] Lock "b8faf6c6-2531-44b9-8382-ddbc0feddf24" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.340s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1109.464947] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fff9745-3622-4681-be69-0a44c51b4f53 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.472749] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b9c8693-ecf6-4f22-8dee-1f57cfac62b9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.504608] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b30da168-591c-476e-a0eb-3ce4b6596973 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.512252] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c3332d7-5186-44e0-990c-7705fc79131c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.525531] env[67977]: DEBUG nova.compute.provider_tree [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1109.534938] env[67977]: DEBUG nova.scheduler.client.report [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1109.551465] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.613s {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1109.551966] env[67977]: DEBUG nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1109.593020] env[67977]: DEBUG nova.compute.utils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1109.594593] env[67977]: DEBUG nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1109.594801] env[67977]: DEBUG nova.network.neutron [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1109.605397] env[67977]: DEBUG nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1109.700404] env[67977]: DEBUG nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1109.730435] env[67977]: DEBUG nova.virt.hardware [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:59:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a2808893-5856-4f0b-bdb3-e7b2ba170e85',id=32,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-1352576169',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1109.730707] env[67977]: DEBUG nova.virt.hardware [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1109.730862] env[67977]: DEBUG nova.virt.hardware [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1109.731056] env[67977]: DEBUG nova.virt.hardware [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1109.731230] env[67977]: DEBUG nova.virt.hardware [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1109.731404] env[67977]: DEBUG nova.virt.hardware [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1109.731605] env[67977]: DEBUG nova.virt.hardware [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1109.731790] env[67977]: DEBUG nova.virt.hardware [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1109.731952] env[67977]: DEBUG 
nova.virt.hardware [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1109.732125] env[67977]: DEBUG nova.virt.hardware [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1109.732302] env[67977]: DEBUG nova.virt.hardware [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1109.733377] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8013f002-9c75-4e82-9d85-6df736f76a2c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.738998] env[67977]: DEBUG nova.policy [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '68d28aef3d664f2f9393af566e537026', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '361fd2b44307439fafff108df64f2f77', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1109.745203] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74341eff-6e01-42f3-a999-804b01505099 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1110.266073] env[67977]: DEBUG nova.network.neutron [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Successfully created port: 9d547183-60c9-4963-af12-b7a033774218 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1111.278356] env[67977]: DEBUG nova.compute.manager [req-f02357d0-f5de-4f73-b32b-b1da3f954bcf req-c533502c-d0f4-4c1e-a713-324b255083e2 service nova] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Received event network-vif-plugged-9d547183-60c9-4963-af12-b7a033774218 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1111.278644] env[67977]: DEBUG oslo_concurrency.lockutils [req-f02357d0-f5de-4f73-b32b-b1da3f954bcf req-c533502c-d0f4-4c1e-a713-324b255083e2 service nova] Acquiring lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1111.278855] env[67977]: DEBUG oslo_concurrency.lockutils [req-f02357d0-f5de-4f73-b32b-b1da3f954bcf req-c533502c-d0f4-4c1e-a713-324b255083e2 service nova] Lock 
"665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1111.279040] env[67977]: DEBUG oslo_concurrency.lockutils [req-f02357d0-f5de-4f73-b32b-b1da3f954bcf req-c533502c-d0f4-4c1e-a713-324b255083e2 service nova] Lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1111.279232] env[67977]: DEBUG nova.compute.manager [req-f02357d0-f5de-4f73-b32b-b1da3f954bcf req-c533502c-d0f4-4c1e-a713-324b255083e2 service nova] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] No waiting events found dispatching network-vif-plugged-9d547183-60c9-4963-af12-b7a033774218 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1111.279485] env[67977]: WARNING nova.compute.manager [req-f02357d0-f5de-4f73-b32b-b1da3f954bcf req-c533502c-d0f4-4c1e-a713-324b255083e2 service nova] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Received unexpected event network-vif-plugged-9d547183-60c9-4963-af12-b7a033774218 for instance with vm_state building and task_state spawning. [ 1111.433019] env[67977]: DEBUG nova.network.neutron [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Successfully updated port: 9d547183-60c9-4963-af12-b7a033774218 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1111.454760] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquiring lock "refresh_cache-665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1111.456896] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquired lock "refresh_cache-665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1111.456896] env[67977]: DEBUG nova.network.neutron [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1111.514313] env[67977]: DEBUG nova.network.neutron [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1111.778154] env[67977]: DEBUG nova.network.neutron [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Updating instance_info_cache with network_info: [{"id": "9d547183-60c9-4963-af12-b7a033774218", "address": "fa:16:3e:4c:6f:56", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.113", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9d547183-60", "ovs_interfaceid": "9d547183-60c9-4963-af12-b7a033774218", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1111.797456] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Releasing lock "refresh_cache-665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1111.797772] env[67977]: DEBUG nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Instance network_info: |[{"id": "9d547183-60c9-4963-af12-b7a033774218", "address": "fa:16:3e:4c:6f:56", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.113", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9d547183-60", "ovs_interfaceid": "9d547183-60c9-4963-af12-b7a033774218", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1111.798177] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None 
req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4c:6f:56', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dbd2870d-a51d-472a-8034-1b3e132b5cb6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9d547183-60c9-4963-af12-b7a033774218', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1111.805958] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Creating folder: Project (361fd2b44307439fafff108df64f2f77). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1111.806863] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-736d3eab-fb2d-4f9c-b688-14098104413a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.819831] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Created folder: Project (361fd2b44307439fafff108df64f2f77) in parent group-v693022.
[ 1111.820166] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Creating folder: Instances. Parent ref: group-v693080. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1111.820528] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-617bec09-ce24-4440-b668-ba28a9f6c7d1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.829067] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Created folder: Instances in parent group-v693080.
[ 1111.829304] env[67977]: DEBUG oslo.service.loopingcall [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1111.829520] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1111.829729] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0770eb7c-7a3a-4c75-b17e-28784e32eed5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.848808] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1111.848808] env[67977]: value = "task-3468188"
[ 1111.848808] env[67977]: _type = "Task"
[ 1111.848808] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1111.856732] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468188, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1112.359629] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468188, 'name': CreateVM_Task, 'duration_secs': 0.298858} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1112.360139] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1112.360507] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1112.360678] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1112.361258] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1112.361534] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b4f2dc87-7edd-4f2c-8a79-a5c7bd7514f7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1112.366693] env[67977]: DEBUG oslo_vmware.api [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Waiting for the task: (returnval){
[ 1112.366693] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523fc46a-389f-fbd3-3b32-7084ddd9e074"
[ 1112.366693] env[67977]: _type = "Task"
[ 1112.366693] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1112.375353] env[67977]: DEBUG oslo_vmware.api [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523fc46a-389f-fbd3-3b32-7084ddd9e074, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1112.878678] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1112.879826] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1112.880335] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1113.492511] env[67977]: DEBUG nova.compute.manager [req-fe76c390-64fb-4c92-bf30-dd246c6b9e50 req-1dd2afe2-f5e2-406d-9faf-5af2f835e018 service nova] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Received event network-changed-9d547183-60c9-4963-af12-b7a033774218 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1113.492773] env[67977]: DEBUG nova.compute.manager [req-fe76c390-64fb-4c92-bf30-dd246c6b9e50 req-1dd2afe2-f5e2-406d-9faf-5af2f835e018 service nova] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Refreshing instance network info cache due to event network-changed-9d547183-60c9-4963-af12-b7a033774218. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1113.492920] env[67977]: DEBUG oslo_concurrency.lockutils [req-fe76c390-64fb-4c92-bf30-dd246c6b9e50 req-1dd2afe2-f5e2-406d-9faf-5af2f835e018 service nova] Acquiring lock "refresh_cache-665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1113.493075] env[67977]: DEBUG oslo_concurrency.lockutils [req-fe76c390-64fb-4c92-bf30-dd246c6b9e50 req-1dd2afe2-f5e2-406d-9faf-5af2f835e018 service nova] Acquired lock "refresh_cache-665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1113.493239] env[67977]: DEBUG nova.network.neutron [req-fe76c390-64fb-4c92-bf30-dd246c6b9e50 req-1dd2afe2-f5e2-406d-9faf-5af2f835e018 service nova] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Refreshing network info cache for port 9d547183-60c9-4963-af12-b7a033774218 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1114.109066] env[67977]: DEBUG nova.network.neutron [req-fe76c390-64fb-4c92-bf30-dd246c6b9e50 req-1dd2afe2-f5e2-406d-9faf-5af2f835e018 service nova] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Updated VIF entry in instance network info cache for port 9d547183-60c9-4963-af12-b7a033774218. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1114.109537] env[67977]: DEBUG nova.network.neutron [req-fe76c390-64fb-4c92-bf30-dd246c6b9e50 req-1dd2afe2-f5e2-406d-9faf-5af2f835e018 service nova] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Updating instance_info_cache with network_info: [{"id": "9d547183-60c9-4963-af12-b7a033774218", "address": "fa:16:3e:4c:6f:56", "network": {"id": "be5f44ec-a95e-455a-b653-0b739080e6d8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.113", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "504bd5d983344574a09ccb5d9ee0ab47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9d547183-60", "ovs_interfaceid": "9d547183-60c9-4963-af12-b7a033774218", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1114.119009] env[67977]: DEBUG oslo_concurrency.lockutils [req-fe76c390-64fb-4c92-bf30-dd246c6b9e50 req-1dd2afe2-f5e2-406d-9faf-5af2f835e018 service nova] Releasing lock "refresh_cache-665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1115.997291] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "f03fe248-75df-4237-a6dd-cc49012c2331" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1115.997291] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "f03fe248-75df-4237-a6dd-cc49012c2331" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1131.881427] env[67977]: DEBUG oslo_concurrency.lockutils [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.329686] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fa2ab9d9-0aa9-4f0c-a48e-bb0c1ec47dba tempest-ServerShowV254Test-1567276119 tempest-ServerShowV254Test-1567276119-project-member] Acquiring lock "2eb73f59-f6c3-4816-b545-9ab391697785" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.330037] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fa2ab9d9-0aa9-4f0c-a48e-bb0c1ec47dba tempest-ServerShowV254Test-1567276119 tempest-ServerShowV254Test-1567276119-project-member] Lock "2eb73f59-f6c3-4816-b545-9ab391697785" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1140.978013] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquiring lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1145.049317] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ac88ae81-04db-4474-8c3e-4da498e6468e tempest-ServerGroupTestJSON-1257420963 tempest-ServerGroupTestJSON-1257420963-project-member] Acquiring lock "206998ab-992a-4bc0-b176-eb490b3cb479" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1145.049317] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ac88ae81-04db-4474-8c3e-4da498e6468e tempest-ServerGroupTestJSON-1257420963 tempest-ServerGroupTestJSON-1257420963-project-member] Lock "206998ab-992a-4bc0-b176-eb490b3cb479" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1145.801400] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1148.051531] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6ddf2ff5-da05-4281-aa65-ed713584cc3e tempest-ServerMetadataTestJSON-580247377 tempest-ServerMetadataTestJSON-580247377-project-member] Acquiring lock "f66d73a3-535f-45ba-a9c4-2436da6ea511" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1148.051846] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6ddf2ff5-da05-4281-aa65-ed713584cc3e tempest-ServerMetadataTestJSON-580247377 tempest-ServerMetadataTestJSON-580247377-project-member] Lock "f66d73a3-535f-45ba-a9c4-2436da6ea511" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1148.775455] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1149.775580] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1149.776029] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1149.776029] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1150.772409] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1150.774899] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1150.774899] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1151.262929] env[67977]: DEBUG oslo_concurrency.lockutils [None req-a19c358d-95ea-4bb3-8813-d105ebab962c tempest-ServerRescueTestJSON-961321 tempest-ServerRescueTestJSON-961321-project-member] Acquiring lock "120b8d8f-3f0e-4cb4-a112-695d6365788a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1151.263292] env[67977]: DEBUG oslo_concurrency.lockutils [None req-a19c358d-95ea-4bb3-8813-d105ebab962c tempest-ServerRescueTestJSON-961321 tempest-ServerRescueTestJSON-961321-project-member] Lock "120b8d8f-3f0e-4cb4-a112-695d6365788a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1151.776930] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1152.775430] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1152.787380] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1152.787572] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1152.787734] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.787887] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1152.789084] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-480f0b86-175b-4c6b-879d-90d22621625d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.797851] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9416012b-9145-4d0c-b2d4-05277ac6481b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.811530] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f7041ca-5afd-4599-bad3-c8532acf8685 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.817604] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2889c18-f46a-41b7-b805-88681d80dc7c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.845589] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180930MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1152.845726] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1152.845912] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1152.918685] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf actively managed on this 
compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1152.918844] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1152.918974] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b22ae1a7-c9b8-464b-a81c-73144a0176be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1152.919115] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 83b04c8c-39f6-4f58-b965-0a94c063b68b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1152.919258] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1152.919405] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1152.919528] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d7719b11-cef7-4878-a693-24dcd085a1d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1152.919646] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1152.919763] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1152.919878] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1152.931546] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 32d860b3-f438-400f-8296-e62cc662d618 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1152.942729] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance cfc3a5c1-ec4a-41c6-911f-96bc0586d17e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1152.955330] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2977d1a5-655c-4dda-bd9e-3664770f3b62 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1152.966870] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d0df4be4-5fb6-4f3a-b2fb-695917fe2b7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1152.976658] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 50ffd9c5-6232-4c6a-ae8c-5492cdf07e32 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1152.981047] env[67977]: WARNING oslo_vmware.rw_handles [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1152.981047] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1152.981047] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1152.981047] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1152.981047] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1152.981047] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1152.981047] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1152.981047] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1152.981047] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1152.981047] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1152.981047] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1152.981047] env[67977]: ERROR oslo_vmware.rw_handles [ 1152.981438] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/b34a2013-2a17-446e-a23f-a1bdc5948437/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1152.983588] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1152.983861] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Copying Virtual Disk [datastore1] vmware_temp/b34a2013-2a17-446e-a23f-a1bdc5948437/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/b34a2013-2a17-446e-a23f-a1bdc5948437/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1152.984169] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-367b05d6-ef93-4c4b-beb7-da7f494fd90b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.987628] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance fd7c6688-4e12-4186-9235-b2ea93592dae has been scheduled to this compute host, 
the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1152.993794] env[67977]: DEBUG oslo_vmware.api [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for the task: (returnval){ [ 1152.993794] env[67977]: value = "task-3468189" [ 1152.993794] env[67977]: _type = "Task" [ 1152.993794] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1152.999289] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 16c81008-fc75-4722-99c2-bfcdb3121d72 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.004974] env[67977]: DEBUG oslo_vmware.api [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Task: {'id': task-3468189, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1153.012675] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e8092e46-3f2e-4b1a-ad47-e8a5c16db13c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.023048] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 3a02e857-fdf9-47fb-a464-7e3683c1ac93 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.035399] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance c2cc96a9-e755-4c6c-b34a-eb28c9c38066 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.045180] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d69d2ea1-575c-4616-a3b4-5b8f381d1fa7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.054149] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1eb71176-689e-4a9d-8852-b289c3d1abbd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.063178] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.072077] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance c82c83e0-ae02-4e15-8f81-23e4ae8ecc5c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.080652] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8f1440c5-e712-4635-9f02-f9cda12da693 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.089741] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f03fe248-75df-4237-a6dd-cc49012c2331 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.098641] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2eb73f59-f6c3-4816-b545-9ab391697785 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.107254] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 206998ab-992a-4bc0-b176-eb490b3cb479 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.116942] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f66d73a3-535f-45ba-a9c4-2436da6ea511 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.126207] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 120b8d8f-3f0e-4cb4-a112-695d6365788a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1153.126447] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1153.126592] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1153.457741] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcbbf5a1-1267-46a8-ac40-8f02a9d17bbb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.465480] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01d11d29-4d87-43b7-b948-7fd39c3da8bb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.498555] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e05c1200-cfa8-4956-80a4-516d154a0a13 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.506079] env[67977]: DEBUG oslo_vmware.exceptions [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1153.508194] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1153.508775] env[67977]: ERROR nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1153.508775] env[67977]: Faults: ['InvalidArgument'] [ 1153.508775] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Traceback (most recent call last): [ 1153.508775] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1153.508775] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] yield resources [ 1153.508775] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1153.508775] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] self.driver.spawn(context, instance, image_meta, [ 1153.508775] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1153.508775] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1153.508775] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1153.508775] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] self._fetch_image_if_missing(context, vi) [ 1153.508775] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] image_cache(vi, tmp_image_ds_loc) [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] vm_util.copy_virtual_disk( [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] session._wait_for_task(vmdk_copy_task) [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] return self.wait_for_task(task_ref) [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] return evt.wait() [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] result = hub.switch() [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1153.509107] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] return self.greenlet.switch() [ 1153.509431] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1153.509431] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] self.f(*self.args, **self.kw) [ 1153.509431] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1153.509431] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] raise exceptions.translate_fault(task_info.error) [ 1153.509431] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1153.509431] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Faults: ['InvalidArgument'] [ 1153.509431] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] [ 1153.509431] env[67977]: INFO nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Terminating instance [ 1153.510662] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1153.511074] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1153.512064] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11d88fbd-6e08-42c8-b117-62b05e06d1c9 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.516054] env[67977]: DEBUG nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1153.516253] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1153.516467] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1df6baed-ea8a-4f32-a640-c188244cc93a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.518698] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05febd8d-f6ef-46c0-ae57-679ce3f21aa2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.530387] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1153.535137] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1153.535137] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1153.535137] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1153.535473] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-adb80485-833c-4362-97a9-c66d922a4948 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.536929] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-06573b35-3a6a-4ea6-a734-807f7cc81286 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.541679] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1153.545695] env[67977]: DEBUG oslo_vmware.api [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for the task: (returnval){ [ 1153.545695] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52219f2e-f4d0-f10b-00b2-e2a3cd6ede01" [ 1153.545695] env[67977]: _type = "Task" [ 1153.545695] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1153.553671] env[67977]: DEBUG oslo_vmware.api [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52219f2e-f4d0-f10b-00b2-e2a3cd6ede01, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1153.558769] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1153.558950] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.713s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1153.610856] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1153.611097] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1153.611275] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Deleting the datastore file [datastore1] 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1153.611541] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-994e9bec-dc27-454a-85ab-8f5dffa46f79 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.617912] env[67977]: DEBUG oslo_vmware.api [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for the task: (returnval){ [ 1153.617912] env[67977]: value = "task-3468191" [ 1153.617912] env[67977]: _type = "Task" [ 1153.617912] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1153.626046] env[67977]: DEBUG oslo_vmware.api [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Task: {'id': task-3468191, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1154.056762] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1154.057094] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Creating directory with path [datastore1] vmware_temp/bf3a851c-9e08-487b-805e-bc65888b80db/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1154.057257] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-16266111-c213-4920-b605-3ab811f4515a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.069187] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Created directory with path [datastore1] vmware_temp/bf3a851c-9e08-487b-805e-bc65888b80db/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1154.069362] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Fetch image to [datastore1] vmware_temp/bf3a851c-9e08-487b-805e-bc65888b80db/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1154.069547] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/bf3a851c-9e08-487b-805e-bc65888b80db/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1154.070295] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dd3ad25-98ab-4c4a-b650-007dbcd63228 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.077060] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52b5c61d-cbec-4d9e-ba3e-e529f1324834 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.086565] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95c4e171-003e-4ab4-8d3d-8278aa92ed14 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.117424] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3dda3a0-4bcc-4ae4-b75f-2cf9a6202bdf {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.128160] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-51053cd0-bca4-4ea2-8251-398282d39e2f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.129847] env[67977]: DEBUG oslo_vmware.api [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Task: {'id': task-3468191, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068051} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1154.130101] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1154.130285] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1154.130459] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1154.130648] env[67977]: INFO nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1154.132713] env[67977]: DEBUG nova.compute.claims [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1154.132903] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1154.133144] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1154.154268] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1154.205884] env[67977]: DEBUG oslo_vmware.rw_handles [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bf3a851c-9e08-487b-805e-bc65888b80db/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1154.264928] env[67977]: DEBUG oslo_vmware.rw_handles [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1154.265180] env[67977]: DEBUG oslo_vmware.rw_handles [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bf3a851c-9e08-487b-805e-bc65888b80db/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1154.567133] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e95a32ef-499c-407f-b2a7-f7139a23491c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.574801] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53483f68-4ea3-419e-ad20-4ac04ac85e75 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.604250] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-670189e7-bdc3-4422-8370-a14e55b31218 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.611119] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2375e81e-8e88-424e-b9a6-cb5388156512 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.624656] env[67977]: DEBUG nova.compute.provider_tree [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1154.633339] env[67977]: DEBUG nova.scheduler.client.report [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1154.647101] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.514s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1154.647596] env[67977]: ERROR nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1154.647596] env[67977]: Faults: ['InvalidArgument'] [ 1154.647596] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Traceback (most recent call last): [ 1154.647596] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1154.647596] env[67977]: ERROR 
nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] self.driver.spawn(context, instance, image_meta, [ 1154.647596] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1154.647596] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1154.647596] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1154.647596] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] self._fetch_image_if_missing(context, vi) [ 1154.647596] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1154.647596] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] image_cache(vi, tmp_image_ds_loc) [ 1154.647596] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] vm_util.copy_virtual_disk( [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] session._wait_for_task(vmdk_copy_task) [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] return self.wait_for_task(task_ref) [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] return evt.wait() [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] result = hub.switch() [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] return self.greenlet.switch() [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1154.647918] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] self.f(*self.args, **self.kw) [ 1154.648272] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1154.648272] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] raise exceptions.translate_fault(task_info.error) [ 1154.648272] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1154.648272] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Faults: ['InvalidArgument'] [ 1154.648272] env[67977]: ERROR nova.compute.manager [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] [ 1154.648405] env[67977]: DEBUG nova.compute.utils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1154.649742] env[67977]: DEBUG nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Build of instance 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf was re-scheduled: A specified parameter was not correct: fileType [ 1154.649742] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1154.650129] env[67977]: DEBUG nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1154.650306] env[67977]: DEBUG nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1154.650477] env[67977]: DEBUG nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1154.650656] env[67977]: DEBUG nova.network.neutron [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1155.007614] env[67977]: DEBUG nova.network.neutron [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1155.019361] env[67977]: INFO nova.compute.manager [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Took 0.37 seconds to deallocate network for instance. [ 1155.173079] env[67977]: INFO nova.scheduler.client.report [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Deleted allocations for instance 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf [ 1155.200869] env[67977]: DEBUG oslo_concurrency.lockutils [None req-881f1e23-dea8-48b1-aaf2-7302d749d9bc tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 425.118s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.200869] env[67977]: DEBUG oslo_concurrency.lockutils [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 23.317s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1155.200869] env[67977]: DEBUG oslo_concurrency.lockutils [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1155.201020] env[67977]: DEBUG oslo_concurrency.lockutils [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1155.201020] env[67977]: DEBUG oslo_concurrency.lockutils [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.203882] env[67977]: INFO nova.compute.manager [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Terminating instance [ 1155.206441] env[67977]: DEBUG nova.compute.manager [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1155.206823] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1155.207300] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8b73fe75-2303-49c8-bcbf-466a635c2881 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.221018] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f8b5167-2fc2-41a3-a2dc-2b0588cb2c70 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.234578] env[67977]: DEBUG nova.compute.manager [None req-b7a53d53-481b-45af-a107-3a3ee04cbab0 tempest-ServersAaction247Test-1483982511 tempest-ServersAaction247Test-1483982511-project-member] [instance: 2a175b1b-e44c-4fd0-801d-445ba66a993c] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1155.258741] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf could not be found. [ 1155.259142] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1155.259448] env[67977]: INFO nova.compute.manager [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1155.259816] env[67977]: DEBUG oslo.service.loopingcall [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1155.260176] env[67977]: DEBUG nova.compute.manager [-] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1155.260379] env[67977]: DEBUG nova.network.neutron [-] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1155.262955] env[67977]: DEBUG nova.compute.manager [None req-b7a53d53-481b-45af-a107-3a3ee04cbab0 tempest-ServersAaction247Test-1483982511 tempest-ServersAaction247Test-1483982511-project-member] [instance: 2a175b1b-e44c-4fd0-801d-445ba66a993c] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1155.286302] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b7a53d53-481b-45af-a107-3a3ee04cbab0 tempest-ServersAaction247Test-1483982511 tempest-ServersAaction247Test-1483982511-project-member] Lock "2a175b1b-e44c-4fd0-801d-445ba66a993c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.102s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.293717] env[67977]: DEBUG nova.network.neutron [-] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1155.299100] env[67977]: DEBUG nova.compute.manager [None req-25e4148e-24fd-409e-a4b4-c0dad5de3c41 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] [instance: 4aa9df28-0b4e-4aef-a647-cd1bd3b15c66] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1155.303274] env[67977]: INFO nova.compute.manager [-] [instance: 3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf] Took 0.04 seconds to deallocate network for instance. [ 1155.321714] env[67977]: DEBUG nova.compute.manager [None req-25e4148e-24fd-409e-a4b4-c0dad5de3c41 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] [instance: 4aa9df28-0b4e-4aef-a647-cd1bd3b15c66] Instance disappeared before build. 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1155.347788] env[67977]: DEBUG oslo_concurrency.lockutils [None req-25e4148e-24fd-409e-a4b4-c0dad5de3c41 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] Lock "4aa9df28-0b4e-4aef-a647-cd1bd3b15c66" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.389s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.358392] env[67977]: DEBUG nova.compute.manager [None req-4fa4a161-da96-45e1-87a9-8ed3231d2db8 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] [instance: 597679a6-42e4-4f77-af28-2eaca094f728] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1155.396332] env[67977]: DEBUG nova.compute.manager [None req-4fa4a161-da96-45e1-87a9-8ed3231d2db8 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] [instance: 597679a6-42e4-4f77-af28-2eaca094f728] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1155.415319] env[67977]: DEBUG oslo_concurrency.lockutils [None req-55724314-f36f-4778-ba11-67b7955d334d tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "3a614fc1-0f0e-43d7-9ff0-6c0fb343cebf" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.217s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.425647] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4fa4a161-da96-45e1-87a9-8ed3231d2db8 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] Lock "597679a6-42e4-4f77-af28-2eaca094f728" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.497s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.436551] env[67977]: DEBUG nova.compute.manager [None req-b43826e5-ee66-41b0-abe3-9993c21ea7f8 tempest-ServerShowV247Test-1404879766 tempest-ServerShowV247Test-1404879766-project-member] [instance: 04d4d42b-1ff6-4159-8a34-1b3549d127c5] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1155.461306] env[67977]: DEBUG nova.compute.manager [None req-b43826e5-ee66-41b0-abe3-9993c21ea7f8 tempest-ServerShowV247Test-1404879766 tempest-ServerShowV247Test-1404879766-project-member] [instance: 04d4d42b-1ff6-4159-8a34-1b3549d127c5] Instance disappeared before build. 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1155.486580] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b43826e5-ee66-41b0-abe3-9993c21ea7f8 tempest-ServerShowV247Test-1404879766 tempest-ServerShowV247Test-1404879766-project-member] Lock "04d4d42b-1ff6-4159-8a34-1b3549d127c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.179s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.497121] env[67977]: DEBUG nova.compute.manager [None req-2796c75f-eca8-45c4-bf85-a28a4edd1357 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] [instance: 4cfb8096-93f1-4400-bb3a-5a2af940532e] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1155.523716] env[67977]: DEBUG nova.compute.manager [None req-2796c75f-eca8-45c4-bf85-a28a4edd1357 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] [instance: 4cfb8096-93f1-4400-bb3a-5a2af940532e] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1155.551783] env[67977]: DEBUG oslo_concurrency.lockutils [None req-2796c75f-eca8-45c4-bf85-a28a4edd1357 tempest-ListServerFiltersTestJSON-579618102 tempest-ListServerFiltersTestJSON-579618102-project-member] Lock "4cfb8096-93f1-4400-bb3a-5a2af940532e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.896s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.558851] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1155.558997] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1155.559139] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1155.563057] env[67977]: DEBUG nova.compute.manager [None req-4245fcf7-c84a-4234-bd22-8a1cfde48e3f tempest-ServerShowV247Test-1404879766 tempest-ServerShowV247Test-1404879766-project-member] [instance: a294e292-bc5a-4e79-8224-1cb8c201e81d] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1155.585890] env[67977]: DEBUG nova.compute.manager [None req-4245fcf7-c84a-4234-bd22-8a1cfde48e3f tempest-ServerShowV247Test-1404879766 tempest-ServerShowV247Test-1404879766-project-member] [instance: a294e292-bc5a-4e79-8224-1cb8c201e81d] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1155.596400] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.596400] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.596529] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.596664] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.596773] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.596894] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.597016] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.597142] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.597259] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.597409] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1155.611801] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4245fcf7-c84a-4234-bd22-8a1cfde48e3f tempest-ServerShowV247Test-1404879766 tempest-ServerShowV247Test-1404879766-project-member] Lock "a294e292-bc5a-4e79-8224-1cb8c201e81d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.350s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.621869] env[67977]: DEBUG nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1155.668529] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1155.668869] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1155.670333] env[67977]: INFO nova.compute.claims [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1156.045744] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7ad0aa9-87d4-4bf3-889f-06dcdc0530dd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.054302] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-562bb66f-3d37-4a49-99b2-bfe6aa037027 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.087614] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7cd134f-51c0-4c01-8eab-1ca4666911ff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.095256] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f0459db-89d3-4bc5-9754-69ddc094c442 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.108501] env[67977]: DEBUG nova.compute.provider_tree [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1156.117365] env[67977]: DEBUG nova.scheduler.client.report [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1156.130631] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.462s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1156.131123] env[67977]: DEBUG nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1156.168920] env[67977]: DEBUG nova.compute.utils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1156.169542] env[67977]: DEBUG nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1156.169710] env[67977]: DEBUG nova.network.neutron [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1156.179355] env[67977]: DEBUG nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1156.253017] env[67977]: DEBUG nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1156.259930] env[67977]: DEBUG nova.policy [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cfe74c307e674da3a83c5e17936073cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0345390d8442489bb91fc726088166a9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1156.283018] env[67977]: DEBUG nova.virt.hardware [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1156.283018] env[67977]: DEBUG nova.virt.hardware [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1156.283018] env[67977]: DEBUG nova.virt.hardware [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1156.283451] env[67977]: DEBUG nova.virt.hardware [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1156.286108] env[67977]: DEBUG nova.virt.hardware [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1156.286354] env[67977]: DEBUG nova.virt.hardware [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1156.286592] env[67977]: 
DEBUG nova.virt.hardware [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1156.286758] env[67977]: DEBUG nova.virt.hardware [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1156.286928] env[67977]: DEBUG nova.virt.hardware [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1156.287106] env[67977]: DEBUG nova.virt.hardware [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1156.287301] env[67977]: DEBUG nova.virt.hardware [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1156.288192] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91b7ba5e-0529-4cc5-a283-502e80ae6ef9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.298584] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c08ad42b-ab79-4041-bf9f-007cd38ca897 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.675917] env[67977]: DEBUG nova.network.neutron [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Successfully created port: 08acc5fd-aeb2-43e1-9a13-011ae73b0508 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1157.332709] env[67977]: DEBUG nova.compute.manager [req-acebb67d-d1b7-4b65-bd37-504b1593642c req-0142dae5-dee0-4333-bfc1-251ddcab11ce service nova] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Received event network-vif-plugged-08acc5fd-aeb2-43e1-9a13-011ae73b0508 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1157.332967] env[67977]: DEBUG oslo_concurrency.lockutils [req-acebb67d-d1b7-4b65-bd37-504b1593642c req-0142dae5-dee0-4333-bfc1-251ddcab11ce service nova] Acquiring lock "32d860b3-f438-400f-8296-e62cc662d618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1157.333148] env[67977]: DEBUG oslo_concurrency.lockutils [req-acebb67d-d1b7-4b65-bd37-504b1593642c 
req-0142dae5-dee0-4333-bfc1-251ddcab11ce service nova] Lock "32d860b3-f438-400f-8296-e62cc662d618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1157.333316] env[67977]: DEBUG oslo_concurrency.lockutils [req-acebb67d-d1b7-4b65-bd37-504b1593642c req-0142dae5-dee0-4333-bfc1-251ddcab11ce service nova] Lock "32d860b3-f438-400f-8296-e62cc662d618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1157.333481] env[67977]: DEBUG nova.compute.manager [req-acebb67d-d1b7-4b65-bd37-504b1593642c req-0142dae5-dee0-4333-bfc1-251ddcab11ce service nova] [instance: 32d860b3-f438-400f-8296-e62cc662d618] No waiting events found dispatching network-vif-plugged-08acc5fd-aeb2-43e1-9a13-011ae73b0508 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1157.333643] env[67977]: WARNING nova.compute.manager [req-acebb67d-d1b7-4b65-bd37-504b1593642c req-0142dae5-dee0-4333-bfc1-251ddcab11ce service nova] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Received unexpected event network-vif-plugged-08acc5fd-aeb2-43e1-9a13-011ae73b0508 for instance with vm_state building and task_state spawning. [ 1157.480442] env[67977]: DEBUG nova.network.neutron [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Successfully updated port: 08acc5fd-aeb2-43e1-9a13-011ae73b0508 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1157.496292] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquiring lock "refresh_cache-32d860b3-f438-400f-8296-e62cc662d618" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1157.496611] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquired lock "refresh_cache-32d860b3-f438-400f-8296-e62cc662d618" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1157.496611] env[67977]: DEBUG nova.network.neutron [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1157.551601] env[67977]: DEBUG nova.network.neutron [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1157.777379] env[67977]: DEBUG nova.network.neutron [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Updating instance_info_cache with network_info: [{"id": "08acc5fd-aeb2-43e1-9a13-011ae73b0508", "address": "fa:16:3e:e8:62:60", "network": {"id": "e48d6ad9-f74c-456d-bfef-bf7663724220", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-355937063-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0345390d8442489bb91fc726088166a9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2", "external-id": "nsx-vlan-transportzone-268", "segmentation_id": 268, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap08acc5fd-ae", "ovs_interfaceid": "08acc5fd-aeb2-43e1-9a13-011ae73b0508", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1157.789973] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Releasing lock "refresh_cache-32d860b3-f438-400f-8296-e62cc662d618" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1157.790436] env[67977]: DEBUG nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Instance network_info: |[{"id": "08acc5fd-aeb2-43e1-9a13-011ae73b0508", "address": "fa:16:3e:e8:62:60", "network": {"id": "e48d6ad9-f74c-456d-bfef-bf7663724220", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-355937063-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0345390d8442489bb91fc726088166a9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2", "external-id": "nsx-vlan-transportzone-268", "segmentation_id": 268, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap08acc5fd-ae", "ovs_interfaceid": "08acc5fd-aeb2-43e1-9a13-011ae73b0508", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1157.790797] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e8:62:60', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '08acc5fd-aeb2-43e1-9a13-011ae73b0508', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1157.798789] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Creating folder: Project (0345390d8442489bb91fc726088166a9). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1157.799348] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-27eff56e-7aa3-4604-9144-85638b23b034 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.811403] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Created folder: Project (0345390d8442489bb91fc726088166a9) in parent group-v693022. [ 1157.811571] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Creating folder: Instances. Parent ref: group-v693083. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1157.811820] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-06d03f47-7365-490f-9ea5-f30f30227d31 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.823841] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Created folder: Instances in parent group-v693083. [ 1157.823841] env[67977]: DEBUG oslo.service.loopingcall [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1157.823841] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1157.823841] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-84bad22a-1c62-4bbe-b868-8c5621c17de3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.854654] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1157.854654] env[67977]: value = "task-3468194" [ 1157.854654] env[67977]: _type = "Task" [ 1157.854654] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1157.863015] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468194, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1158.365211] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468194, 'name': CreateVM_Task} progress is 99%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1158.865316] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468194, 'name': CreateVM_Task} progress is 99%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1159.355835] env[67977]: DEBUG nova.compute.manager [req-e0762d3d-807c-4204-93d3-3b26b29e8639 req-78b72559-c4f6-4a6d-a445-1ef205249e4c service nova] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Received event network-changed-08acc5fd-aeb2-43e1-9a13-011ae73b0508 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1159.355998] env[67977]: DEBUG nova.compute.manager [req-e0762d3d-807c-4204-93d3-3b26b29e8639 req-78b72559-c4f6-4a6d-a445-1ef205249e4c service nova] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Refreshing instance network info cache due to event network-changed-08acc5fd-aeb2-43e1-9a13-011ae73b0508. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1159.356252] env[67977]: DEBUG oslo_concurrency.lockutils [req-e0762d3d-807c-4204-93d3-3b26b29e8639 req-78b72559-c4f6-4a6d-a445-1ef205249e4c service nova] Acquiring lock "refresh_cache-32d860b3-f438-400f-8296-e62cc662d618" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1159.356415] env[67977]: DEBUG oslo_concurrency.lockutils [req-e0762d3d-807c-4204-93d3-3b26b29e8639 req-78b72559-c4f6-4a6d-a445-1ef205249e4c service nova] Acquired lock "refresh_cache-32d860b3-f438-400f-8296-e62cc662d618" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1159.356580] env[67977]: DEBUG nova.network.neutron [req-e0762d3d-807c-4204-93d3-3b26b29e8639 req-78b72559-c4f6-4a6d-a445-1ef205249e4c service nova] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Refreshing network info cache for port 08acc5fd-aeb2-43e1-9a13-011ae73b0508 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1159.370080] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468194, 'name': CreateVM_Task, 'duration_secs': 1.310569} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1159.370544] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1159.371142] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1159.371306] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1159.371650] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1159.372112] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-19d887b1-8b08-4a11-be38-7b50a3c83e14 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.376709] env[67977]: DEBUG oslo_vmware.api [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Waiting for the task: (returnval){ [ 1159.376709] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52090b9d-6067-7515-7bbc-bb5ee46d4bbc" [ 1159.376709] env[67977]: _type = "Task" [ 1159.376709] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1159.384514] env[67977]: DEBUG oslo_vmware.api [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52090b9d-6067-7515-7bbc-bb5ee46d4bbc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1159.600496] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquiring lock "32d860b3-f438-400f-8296-e62cc662d618" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.890404] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1159.890684] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1159.891408] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1159.923809] env[67977]: DEBUG nova.network.neutron [req-e0762d3d-807c-4204-93d3-3b26b29e8639 req-78b72559-c4f6-4a6d-a445-1ef205249e4c service nova] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Updated VIF entry in instance network info cache for port 08acc5fd-aeb2-43e1-9a13-011ae73b0508. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1159.924171] env[67977]: DEBUG nova.network.neutron [req-e0762d3d-807c-4204-93d3-3b26b29e8639 req-78b72559-c4f6-4a6d-a445-1ef205249e4c service nova] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Updating instance_info_cache with network_info: [{"id": "08acc5fd-aeb2-43e1-9a13-011ae73b0508", "address": "fa:16:3e:e8:62:60", "network": {"id": "e48d6ad9-f74c-456d-bfef-bf7663724220", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-355937063-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0345390d8442489bb91fc726088166a9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2", "external-id": "nsx-vlan-transportzone-268", "segmentation_id": 268, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap08acc5fd-ae", "ovs_interfaceid": "08acc5fd-aeb2-43e1-9a13-011ae73b0508", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1159.939266] env[67977]: DEBUG oslo_concurrency.lockutils [req-e0762d3d-807c-4204-93d3-3b26b29e8639 req-78b72559-c4f6-4a6d-a445-1ef205249e4c service nova] Releasing lock "refresh_cache-32d860b3-f438-400f-8296-e62cc662d618" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1163.292371] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "5edda5cc-6295-4abe-a21e-0cf684063cb3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1163.292667] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "5edda5cc-6295-4abe-a21e-0cf684063cb3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1167.359839] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3cc51610-cc38-48e9-a008-7268194d729f tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "a61121e3-0375-41ef-9dad-0469583640ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1167.360330] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3cc51610-cc38-48e9-a008-7268194d729f tempest-ServerDiskConfigTestJSON-233315990 
tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "a61121e3-0375-41ef-9dad-0469583640ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1171.736302] env[67977]: DEBUG oslo_concurrency.lockutils [None req-32f0bee9-eea2-408b-8367-8620b12489ae tempest-InstanceActionsV221TestJSON-1058234220 tempest-InstanceActionsV221TestJSON-1058234220-project-member] Acquiring lock "8351701c-562c-4ec6-998a-a0b1a62f0c5e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1171.736594] env[67977]: DEBUG oslo_concurrency.lockutils [None req-32f0bee9-eea2-408b-8367-8620b12489ae tempest-InstanceActionsV221TestJSON-1058234220 tempest-InstanceActionsV221TestJSON-1058234220-project-member] Lock "8351701c-562c-4ec6-998a-a0b1a62f0c5e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1173.034867] env[67977]: DEBUG oslo_concurrency.lockutils [None req-564df29f-d093-4b9c-ae99-4c262f9e3e06 tempest-ServersTestMultiNic-1583074918 tempest-ServersTestMultiNic-1583074918-project-member] Acquiring lock "ea5d10a1-2beb-4997-beff-e6a0e24e5b9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1173.035234] env[67977]: DEBUG oslo_concurrency.lockutils [None req-564df29f-d093-4b9c-ae99-4c262f9e3e06 tempest-ServersTestMultiNic-1583074918 tempest-ServersTestMultiNic-1583074918-project-member] Lock "ea5d10a1-2beb-4997-beff-e6a0e24e5b9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1183.119597] env[67977]: DEBUG oslo_concurrency.lockutils [None req-74451808-c718-429c-937d-4217647056e7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "33412c29-7b03-421f-9502-4c2a639adfd7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1183.119972] env[67977]: DEBUG oslo_concurrency.lockutils [None req-74451808-c718-429c-937d-4217647056e7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "33412c29-7b03-421f-9502-4c2a639adfd7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1197.910920] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e317a14c-c2b2-4fb0-a2c6-1b92546f7f7d tempest-ServersNegativeTestJSON-1846221724 tempest-ServersNegativeTestJSON-1846221724-project-member] Acquiring lock "508bc9c7-8a3b-4ac8-9eda-b2d3f5916997" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1197.911250] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e317a14c-c2b2-4fb0-a2c6-1b92546f7f7d tempest-ServersNegativeTestJSON-1846221724 tempest-ServersNegativeTestJSON-1846221724-project-member] Lock "508bc9c7-8a3b-4ac8-9eda-b2d3f5916997" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1198.450544] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53042408-bbc4-4faf-9d42-414588ce70e6 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "8d6894ae-ecbf-4ad2-ae91-463c88ef5de3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1198.450854] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53042408-bbc4-4faf-9d42-414588ce70e6 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "8d6894ae-ecbf-4ad2-ae91-463c88ef5de3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1202.995800] env[67977]: WARNING oslo_vmware.rw_handles [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1202.995800] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1202.995800] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1202.995800] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1202.995800] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1202.995800] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1202.995800] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1202.995800] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1202.995800] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1202.995800] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1202.995800] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1202.995800] env[67977]: ERROR oslo_vmware.rw_handles [ 1202.996448] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/bf3a851c-9e08-487b-805e-bc65888b80db/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data 
store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1202.998328] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1202.998586] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Copying Virtual Disk [datastore1] vmware_temp/bf3a851c-9e08-487b-805e-bc65888b80db/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/bf3a851c-9e08-487b-805e-bc65888b80db/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1202.998889] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-62de5ab4-7001-4129-8493-ab5c5aff4818 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.006613] env[67977]: DEBUG oslo_vmware.api [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for the task: (returnval){ [ 1203.006613] env[67977]: value = "task-3468195" [ 1203.006613] env[67977]: _type = "Task" [ 1203.006613] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1203.014589] env[67977]: DEBUG oslo_vmware.api [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Task: {'id': task-3468195, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1203.517012] env[67977]: DEBUG oslo_vmware.exceptions [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1203.517313] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1203.517859] env[67977]: ERROR nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1203.517859] env[67977]: Faults: ['InvalidArgument'] [ 1203.517859] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Traceback (most recent call last): [ 1203.517859] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1203.517859] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] yield resources [ 1203.517859] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1203.517859] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] self.driver.spawn(context, instance, image_meta, [ 1203.517859] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1203.517859] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1203.517859] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1203.517859] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] self._fetch_image_if_missing(context, vi) [ 1203.517859] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] image_cache(vi, tmp_image_ds_loc) [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] vm_util.copy_virtual_disk( [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] session._wait_for_task(vmdk_copy_task) [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] return self.wait_for_task(task_ref) [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] return evt.wait() [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] result = hub.switch() [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1203.518317] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] return self.greenlet.switch() [ 1203.518785] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1203.518785] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] self.f(*self.args, **self.kw) [ 1203.518785] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1203.518785] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] raise exceptions.translate_fault(task_info.error) [ 1203.518785] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1203.518785] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Faults: ['InvalidArgument'] [ 1203.518785] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] [ 1203.518785] env[67977]: INFO nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Terminating instance [ 1203.519745] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1203.519948] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1203.520559] env[67977]: DEBUG nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: 
af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1203.520770] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1203.520989] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-50f687b1-2f75-4e57-996c-32384ade4eff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.523296] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9395fe40-964e-4f1e-9f61-ce2545c0263a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.531429] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1203.531644] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6437ee5c-bc1a-4c09-9614-ad124b80d003 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.533966] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1203.534092] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1203.535041] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-27832447-6df7-4e91-bcd3-7dcc718cdcf3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.540312] env[67977]: DEBUG oslo_vmware.api [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Waiting for the task: (returnval){ [ 1203.540312] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52fdc6ee-712b-20ab-3045-31e1c181cb50" [ 1203.540312] env[67977]: _type = "Task" [ 1203.540312] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1203.548580] env[67977]: DEBUG oslo_vmware.api [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52fdc6ee-712b-20ab-3045-31e1c181cb50, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1204.050516] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1204.050816] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Creating directory with path [datastore1] vmware_temp/819814e0-a044-429f-a056-6f8d78265ae6/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1204.051023] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-47b8e237-c6bd-48c4-ad2b-bc4697be32c4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.070364] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Created directory with path [datastore1] vmware_temp/819814e0-a044-429f-a056-6f8d78265ae6/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1204.070579] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Fetch image to [datastore1] vmware_temp/819814e0-a044-429f-a056-6f8d78265ae6/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1204.070775] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/819814e0-a044-429f-a056-6f8d78265ae6/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1204.071584] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5aebd66d-5ab0-4f69-8b4c-050c21f4e699 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.078130] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8edd8a3f-abe5-482e-8784-ace81c50119f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.087242] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1f110bd-6a35-4320-bb49-eedc48af4cd9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.117389] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3a807091-14fc-42d2-8aea-9c73dba3c10c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.123132] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-15e0e1f5-6d7a-426b-ac8c-5153bff14932 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.142843] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1204.192724] env[67977]: DEBUG oslo_vmware.rw_handles [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/819814e0-a044-429f-a056-6f8d78265ae6/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1204.251932] env[67977]: DEBUG oslo_vmware.rw_handles [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1204.252138] env[67977]: DEBUG oslo_vmware.rw_handles [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/819814e0-a044-429f-a056-6f8d78265ae6/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1205.096146] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1205.096500] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1205.096709] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Deleting the datastore file [datastore1] af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1205.096997] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e499e7ad-19e2-4ddc-9bdf-d94474c2d011 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1205.104664] env[67977]: DEBUG oslo_vmware.api [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for the task: (returnval){ [ 1205.104664] env[67977]: value = "task-3468197" [ 1205.104664] env[67977]: _type = "Task" [ 1205.104664] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1205.112602] env[67977]: DEBUG oslo_vmware.api [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Task: {'id': task-3468197, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1205.614921] env[67977]: DEBUG oslo_vmware.api [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Task: {'id': task-3468197, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068804} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1205.615125] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1205.615314] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1205.615487] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1205.615659] env[67977]: INFO nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Took 2.09 seconds to destroy the instance on the hypervisor. [ 1205.617855] env[67977]: DEBUG nova.compute.claims [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1205.618163] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1205.618470] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1206.048094] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e23f773-1d85-48ec-9d13-71d0bdff5850 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.056256] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd6ac3cf-6aad-4bc8-aa7a-62cdd053dec4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.085576] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88cf097f-76f8-4137-aedf-29c533f02ed4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.092848] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6b78fe3a-78da-42c9-b728-be4586c6d115 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.106060] env[67977]: DEBUG nova.compute.provider_tree [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1206.114385] env[67977]: DEBUG nova.scheduler.client.report [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1206.128240] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.510s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.128751] env[67977]: ERROR nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1206.128751] env[67977]: Faults: ['InvalidArgument'] [ 1206.128751] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Traceback (most recent call last): [ 1206.128751] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1206.128751] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] self.driver.spawn(context, instance, image_meta, [ 1206.128751] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1206.128751] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1206.128751] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1206.128751] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] self._fetch_image_if_missing(context, vi) [ 1206.128751] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1206.128751] env[67977]: ERROR nova.compute.manager [instance: 
af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] image_cache(vi, tmp_image_ds_loc) [ 1206.128751] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] vm_util.copy_virtual_disk( [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] session._wait_for_task(vmdk_copy_task) [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] return self.wait_for_task(task_ref) [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] return evt.wait() [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] result = hub.switch() [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] return self.greenlet.switch() [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1206.129115] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] self.f(*self.args, **self.kw) [ 1206.129506] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1206.129506] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] raise exceptions.translate_fault(task_info.error) [ 1206.129506] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1206.129506] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Faults: ['InvalidArgument'] [ 1206.129506] env[67977]: ERROR nova.compute.manager [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] [ 1206.129506] env[67977]: DEBUG nova.compute.utils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1206.131156] env[67977]: DEBUG 
nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Build of instance af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f was re-scheduled: A specified parameter was not correct: fileType [ 1206.131156] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1206.131536] env[67977]: DEBUG nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1206.131708] env[67977]: DEBUG nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1206.131874] env[67977]: DEBUG nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1206.132049] env[67977]: DEBUG nova.network.neutron [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1206.502032] env[67977]: DEBUG nova.network.neutron [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1206.513383] env[67977]: INFO nova.compute.manager [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Took 0.38 seconds to deallocate network for instance. 
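The task-polling entries above (CreateVM_Task and CopyVirtualDisk_Task moving from "progress is 0%." to completion, and CopyVirtualDisk_Task ending in Faults: ['InvalidArgument']) all follow one pattern: a loop polls the vCenter task object and, when the task reaches the error state, translates the recorded fault into a VimFaultException that propagates up through Nova's spawn path. A minimal sketch of that loop, assuming a hypothetical fetch_task_info callable in place of the real vSphere API calls (this is not the oslo.vmware implementation):

import time

class VimFaultException(Exception):
    """Raised when a vCenter task finishes in the 'error' state."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(fetch_task_info, interval=0.5):
    """Poll a vCenter task until it succeeds, or raise on failure.

    fetch_task_info is a hypothetical callable returning a dict like
    {'state': 'running'|'success'|'error', 'progress': int,
     'fault': {'faults': [...], 'msg': str}}.
    """
    while True:
        info = fetch_task_info()
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            fault = info['fault']
            # Mirrors how "Faults: ['InvalidArgument']" in the log
            # surfaces as a VimFaultException inside Nova.
            raise VimFaultException(fault['faults'], fault['msg'])
        # Corresponds to the repeated "progress is 0%." / "progress is
        # 99%." debug lines between polls.
        time.sleep(interval)

This is why the single vCenter-side fault ("A specified parameter was not correct: fileType") appears twice in the trace above: once where the task poller raises it out of _poll_task, and again when _build_and_run_instance catches it, aborts the resource claim, and re-schedules the instance.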
[ 1206.604819] env[67977]: INFO nova.scheduler.client.report [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Deleted allocations for instance af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f [ 1206.625780] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b5bafd80-c868-44b1-8bd3-0b228353dd32 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 472.492s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.626812] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 271.521s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1206.627037] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Acquiring lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1206.627252] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1206.627417] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.630047] env[67977]: INFO nova.compute.manager [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Terminating instance [ 1206.632928] env[67977]: DEBUG nova.compute.manager [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1206.632928] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1206.633162] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7c090d99-20a0-400e-964d-009abab8187a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.637836] env[67977]: DEBUG nova.compute.manager [None req-f0afa998-2ebe-495b-970d-dd324c8fb750 tempest-InstanceActionsNegativeTestJSON-546422566 tempest-InstanceActionsNegativeTestJSON-546422566-project-member] [instance: cfc3a5c1-ec4a-41c6-911f-96bc0586d17e] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1206.644101] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-375332c3-0094-418d-adbc-a4c21b6883b4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.663433] env[67977]: DEBUG nova.compute.manager [None req-f0afa998-2ebe-495b-970d-dd324c8fb750 tempest-InstanceActionsNegativeTestJSON-546422566 tempest-InstanceActionsNegativeTestJSON-546422566-project-member] [instance: cfc3a5c1-ec4a-41c6-911f-96bc0586d17e] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1206.673842] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f could not be found. [ 1206.674044] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1206.674218] env[67977]: INFO nova.compute.manager [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1206.674451] env[67977]: DEBUG oslo.service.loopingcall [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1206.674672] env[67977]: DEBUG nova.compute.manager [-] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1206.674768] env[67977]: DEBUG nova.network.neutron [-] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1206.689154] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f0afa998-2ebe-495b-970d-dd324c8fb750 tempest-InstanceActionsNegativeTestJSON-546422566 tempest-InstanceActionsNegativeTestJSON-546422566-project-member] Lock "cfc3a5c1-ec4a-41c6-911f-96bc0586d17e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 241.257s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.698497] env[67977]: DEBUG nova.compute.manager [None req-567aae11-b45a-4166-b527-206e03bac6d0 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2977d1a5-655c-4dda-bd9e-3664770f3b62] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1206.700918] env[67977]: DEBUG nova.network.neutron [-] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1206.707962] env[67977]: INFO nova.compute.manager [-] [instance: af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f] Took 0.03 seconds to deallocate network for instance. [ 1206.726763] env[67977]: DEBUG nova.compute.manager [None req-567aae11-b45a-4166-b527-206e03bac6d0 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2977d1a5-655c-4dda-bd9e-3664770f3b62] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1206.746299] env[67977]: DEBUG oslo_concurrency.lockutils [None req-567aae11-b45a-4166-b527-206e03bac6d0 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "2977d1a5-655c-4dda-bd9e-3664770f3b62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.318s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.755289] env[67977]: DEBUG nova.compute.manager [None req-f4055fa8-ff89-4d6e-88cf-55ba3a1982ed tempest-ServerAddressesTestJSON-1824039280 tempest-ServerAddressesTestJSON-1824039280-project-member] [instance: d0df4be4-5fb6-4f3a-b2fb-695917fe2b7d] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1206.778285] env[67977]: DEBUG nova.compute.manager [None req-f4055fa8-ff89-4d6e-88cf-55ba3a1982ed tempest-ServerAddressesTestJSON-1824039280 tempest-ServerAddressesTestJSON-1824039280-project-member] [instance: d0df4be4-5fb6-4f3a-b2fb-695917fe2b7d] Instance disappeared before build. 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1206.799796] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c299c739-43d1-4c82-8ad8-4ccb53320c22 tempest-ServersAdminTestJSON-1318533534 tempest-ServersAdminTestJSON-1318533534-project-member] Lock "af6b3228-0e6e-4c8f-a9ef-5b5db684ec5f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.173s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.804130] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f4055fa8-ff89-4d6e-88cf-55ba3a1982ed tempest-ServerAddressesTestJSON-1824039280 tempest-ServerAddressesTestJSON-1824039280-project-member] Lock "d0df4be4-5fb6-4f3a-b2fb-695917fe2b7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 235.673s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.814335] env[67977]: DEBUG nova.compute.manager [None req-73346213-1e72-4300-a48a-34d4083e3167 tempest-ServersTestMultiNic-1583074918 tempest-ServersTestMultiNic-1583074918-project-member] [instance: 50ffd9c5-6232-4c6a-ae8c-5492cdf07e32] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1206.839476] env[67977]: DEBUG nova.compute.manager [None req-73346213-1e72-4300-a48a-34d4083e3167 tempest-ServersTestMultiNic-1583074918 tempest-ServersTestMultiNic-1583074918-project-member] [instance: 50ffd9c5-6232-4c6a-ae8c-5492cdf07e32] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1206.862897] env[67977]: DEBUG oslo_concurrency.lockutils [None req-73346213-1e72-4300-a48a-34d4083e3167 tempest-ServersTestMultiNic-1583074918 tempest-ServersTestMultiNic-1583074918-project-member] Lock "50ffd9c5-6232-4c6a-ae8c-5492cdf07e32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 234.686s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.875929] env[67977]: DEBUG nova.compute.manager [None req-1a8319ac-043e-4381-acdc-7e571a1f63d4 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] [instance: fd7c6688-4e12-4186-9235-b2ea93592dae] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1206.913020] env[67977]: DEBUG nova.compute.manager [None req-1a8319ac-043e-4381-acdc-7e571a1f63d4 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] [instance: fd7c6688-4e12-4186-9235-b2ea93592dae] Instance disappeared before build.
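
The pattern repeating through these entries (a "Starting instance..." line, then "Instance disappeared before build.", then a per-instance lock released after 200+ seconds) is a queue draining: builds serialize on the instance UUID lock, and by the time each waiter finally acquires it the tempest test has already deleted the instance, so the build exits early without claiming resources. A rough sketch of that short-circuit, assuming a refresh that exposes the deletion; the helper names are hypothetical:

    # Sketch: per-instance lock plus the disappeared-before-build check.
    import logging

    from oslo_concurrency import lockutils

    LOG = logging.getLogger(__name__)

    def do_build(context, instance):
        """Stand-in for the real build path."""

    def build_and_run_instance(context, instance):
        @lockutils.synchronized(instance.uuid, 'nova-')
        def _locked_do_build_and_run_instance():
            instance.refresh()       # state may be minutes old after the wait
            if instance.deleted:     # deleted while queued on the lock
                LOG.debug('Instance disappeared before build.')
                return
            do_build(context, instance)

        _locked_do_build_and_run_instance()
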
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1206.937160] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1a8319ac-043e-4381-acdc-7e571a1f63d4 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Lock "fd7c6688-4e12-4186-9235-b2ea93592dae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 228.673s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.947536] env[67977]: DEBUG nova.compute.manager [None req-1a8319ac-043e-4381-acdc-7e571a1f63d4 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] [instance: 16c81008-fc75-4722-99c2-bfcdb3121d72] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1206.971660] env[67977]: DEBUG nova.compute.manager [None req-1a8319ac-043e-4381-acdc-7e571a1f63d4 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] [instance: 16c81008-fc75-4722-99c2-bfcdb3121d72] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1206.990670] env[67977]: DEBUG oslo_concurrency.lockutils [None req-1a8319ac-043e-4381-acdc-7e571a1f63d4 tempest-MultipleCreateTestJSON-1530900820 tempest-MultipleCreateTestJSON-1530900820-project-member] Lock "16c81008-fc75-4722-99c2-bfcdb3121d72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 228.693s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.002014] env[67977]: DEBUG nova.compute.manager [None req-068e211b-dc93-4099-90ec-09b6259fe94c tempest-ServersNegativeTestMultiTenantJSON-1350597422 tempest-ServersNegativeTestMultiTenantJSON-1350597422-project-member] [instance: e8092e46-3f2e-4b1a-ad47-e8a5c16db13c] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1207.025503] env[67977]: DEBUG nova.compute.manager [None req-068e211b-dc93-4099-90ec-09b6259fe94c tempest-ServersNegativeTestMultiTenantJSON-1350597422 tempest-ServersNegativeTestMultiTenantJSON-1350597422-project-member] [instance: e8092e46-3f2e-4b1a-ad47-e8a5c16db13c] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1207.047926] env[67977]: DEBUG oslo_concurrency.lockutils [None req-068e211b-dc93-4099-90ec-09b6259fe94c tempest-ServersNegativeTestMultiTenantJSON-1350597422 tempest-ServersNegativeTestMultiTenantJSON-1350597422-project-member] Lock "e8092e46-3f2e-4b1a-ad47-e8a5c16db13c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 215.836s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.061027] env[67977]: DEBUG nova.compute.manager [None req-c92e587b-96db-460b-9283-34bd62954090 tempest-AttachInterfacesUnderV243Test-1161791688 tempest-AttachInterfacesUnderV243Test-1161791688-project-member] [instance: 3a02e857-fdf9-47fb-a464-7e3683c1ac93] Starting instance...
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1207.087761] env[67977]: DEBUG nova.compute.manager [None req-c92e587b-96db-460b-9283-34bd62954090 tempest-AttachInterfacesUnderV243Test-1161791688 tempest-AttachInterfacesUnderV243Test-1161791688-project-member] [instance: 3a02e857-fdf9-47fb-a464-7e3683c1ac93] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1207.107479] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c92e587b-96db-460b-9283-34bd62954090 tempest-AttachInterfacesUnderV243Test-1161791688 tempest-AttachInterfacesUnderV243Test-1161791688-project-member] Lock "3a02e857-fdf9-47fb-a464-7e3683c1ac93" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 214.969s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.117612] env[67977]: DEBUG nova.compute.manager [None req-16e1c2c4-e4ac-4682-9230-9aef90ea05af tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: c2cc96a9-e755-4c6c-b34a-eb28c9c38066] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1207.142035] env[67977]: DEBUG nova.compute.manager [None req-16e1c2c4-e4ac-4682-9230-9aef90ea05af tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: c2cc96a9-e755-4c6c-b34a-eb28c9c38066] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1207.163293] env[67977]: DEBUG oslo_concurrency.lockutils [None req-16e1c2c4-e4ac-4682-9230-9aef90ea05af tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "c2cc96a9-e755-4c6c-b34a-eb28c9c38066" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 207.682s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.172238] env[67977]: DEBUG nova.compute.manager [None req-3cb17c46-33f5-4b9d-b713-6c4fff757516 tempest-ServerRescueNegativeTestJSON-1640851732 tempest-ServerRescueNegativeTestJSON-1640851732-project-member] [instance: d69d2ea1-575c-4616-a3b4-5b8f381d1fa7] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1207.195819] env[67977]: DEBUG nova.compute.manager [None req-3cb17c46-33f5-4b9d-b713-6c4fff757516 tempest-ServerRescueNegativeTestJSON-1640851732 tempest-ServerRescueNegativeTestJSON-1640851732-project-member] [instance: d69d2ea1-575c-4616-a3b4-5b8f381d1fa7] Instance disappeared before build.
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1207.216016] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3cb17c46-33f5-4b9d-b713-6c4fff757516 tempest-ServerRescueNegativeTestJSON-1640851732 tempest-ServerRescueNegativeTestJSON-1640851732-project-member] Lock "d69d2ea1-575c-4616-a3b4-5b8f381d1fa7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 205.621s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.225573] env[67977]: DEBUG nova.compute.manager [None req-b7769a77-5d12-45f6-8b6d-233831448ad2 tempest-ServerRescueNegativeTestJSON-1640851732 tempest-ServerRescueNegativeTestJSON-1640851732-project-member] [instance: 1eb71176-689e-4a9d-8852-b289c3d1abbd] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1207.249156] env[67977]: DEBUG nova.compute.manager [None req-b7769a77-5d12-45f6-8b6d-233831448ad2 tempest-ServerRescueNegativeTestJSON-1640851732 tempest-ServerRescueNegativeTestJSON-1640851732-project-member] [instance: 1eb71176-689e-4a9d-8852-b289c3d1abbd] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1207.270701] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b7769a77-5d12-45f6-8b6d-233831448ad2 tempest-ServerRescueNegativeTestJSON-1640851732 tempest-ServerRescueNegativeTestJSON-1640851732-project-member] Lock "1eb71176-689e-4a9d-8852-b289c3d1abbd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 204.327s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.280229] env[67977]: DEBUG nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Starting instance...
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1207.334030] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1207.334309] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1207.335897] env[67977]: INFO nova.compute.claims [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1207.642683] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87f0a78b-6b91-471c-b660-ee6483c7ebc8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.650746] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbd00162-4359-432a-98e1-34ccb1401b2d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.680737] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63c7fef9-adaf-4904-a90f-64eb96814554 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.687233] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ac7537e-852c-45d2-8428-bed35f0bf73c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.699970] env[67977]: DEBUG nova.compute.provider_tree [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1207.708493] env[67977]: DEBUG nova.scheduler.client.report [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1207.721640] env[67977]: DEBUG oslo_concurrency.lockutils [None 
req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.387s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.722101] env[67977]: DEBUG nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1207.752464] env[67977]: DEBUG nova.compute.utils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1207.753780] env[67977]: DEBUG nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1207.753965] env[67977]: DEBUG nova.network.neutron [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1207.761225] env[67977]: DEBUG nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1207.810138] env[67977]: DEBUG nova.policy [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9b401e28decd40eca7b2e2d4e88ea43b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fa83fa9567bf41679eaa6b42ebcbe9bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1207.824890] env[67977]: DEBUG nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Start spawning the instance on the hypervisor. 
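
The claim above, together with the inventory report, makes the placement arithmetic easy to check: allocatable capacity per resource class is (total - reserved) * allocation_ratio, and max_unit caps what any single allocation may take. Plugging in the logged numbers:

    # Cross-check of the inventory data logged above.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 94},
    }
    for rc, inv in inventory.items():
        cap = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: {cap:g} allocatable, at most {inv['max_unit']} per allocation")
    # VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400 (capped at 94 GB per allocation)
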
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1207.849017] env[67977]: DEBUG nova.virt.hardware [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1207.849379] env[67977]: DEBUG nova.virt.hardware [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1207.849495] env[67977]: DEBUG nova.virt.hardware [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1207.849704] env[67977]: DEBUG nova.virt.hardware [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1207.849855] env[67977]: DEBUG nova.virt.hardware [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1207.850011] env[67977]: DEBUG nova.virt.hardware [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1207.850227] env[67977]: DEBUG nova.virt.hardware [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1207.850389] env[67977]: DEBUG nova.virt.hardware [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1207.850559] env[67977]: DEBUG 
nova.virt.hardware [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1207.850718] env[67977]: DEBUG nova.virt.hardware [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1207.850890] env[67977]: DEBUG nova.virt.hardware [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1207.851807] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-660857c5-be94-4f1e-8e70-35a8883a4399 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.862036] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3b0e202-d9d0-4724-b924-519e08cf9665 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.238410] env[67977]: DEBUG nova.network.neutron [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Successfully created port: 0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1208.946000] env[67977]: DEBUG nova.compute.manager [req-c607f5f4-0cc8-4e12-b40e-1de7cda8a124 req-29ca91ad-f5ff-4929-bdc7-474690bf538e service nova] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Received event network-vif-plugged-0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1208.946243] env[67977]: DEBUG oslo_concurrency.lockutils [req-c607f5f4-0cc8-4e12-b40e-1de7cda8a124 req-29ca91ad-f5ff-4929-bdc7-474690bf538e service nova] Acquiring lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1208.946508] env[67977]: DEBUG oslo_concurrency.lockutils [req-c607f5f4-0cc8-4e12-b40e-1de7cda8a124 req-29ca91ad-f5ff-4929-bdc7-474690bf538e service nova] Lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1208.946685] env[67977]: DEBUG oslo_concurrency.lockutils [req-c607f5f4-0cc8-4e12-b40e-1de7cda8a124 req-29ca91ad-f5ff-4929-bdc7-474690bf538e service nova] Lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1208.946853] env[67977]: DEBUG
nova.compute.manager [req-c607f5f4-0cc8-4e12-b40e-1de7cda8a124 req-29ca91ad-f5ff-4929-bdc7-474690bf538e service nova] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] No waiting events found dispatching network-vif-plugged-0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1208.947146] env[67977]: WARNING nova.compute.manager [req-c607f5f4-0cc8-4e12-b40e-1de7cda8a124 req-29ca91ad-f5ff-4929-bdc7-474690bf538e service nova] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Received unexpected event network-vif-plugged-0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d for instance with vm_state building and task_state spawning. [ 1209.040645] env[67977]: DEBUG nova.network.neutron [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Successfully updated port: 0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1209.056536] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquiring lock "refresh_cache-27743458-4ef0-4ceb-a0bf-cac219dbdc35" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1209.056697] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquired lock "refresh_cache-27743458-4ef0-4ceb-a0bf-cac219dbdc35" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1209.056851] env[67977]: DEBUG nova.network.neutron [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1209.111132] env[67977]: DEBUG nova.network.neutron [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Instance cache missing network info. 
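
The "No waiting events found dispatching network-vif-plugged-..." debug line and the "Received unexpected event" warning that follows are benign ordering noise: Neutron delivered the vif-plugged notification before the spawning thread registered a waiter for it. The dispatch side behaves like a latch table keyed by instance and event; a minimal stand-in sketch, with threading.Event in place of Nova's eventlet-based primitive and all names hypothetical:

    # Sketch: event latch keyed by (instance_uuid, event_key).
    import threading

    _waiters = {}
    _table_lock = threading.Lock()

    def prepare_event(instance_uuid, event_key):
        ev = threading.Event()
        with _table_lock:
            _waiters[(instance_uuid, event_key)] = ev
        return ev                      # the spawning thread ev.wait()s on this

    def dispatch_event(instance_uuid, event_key):
        with _table_lock:
            ev = _waiters.pop((instance_uuid, event_key), None)
        if ev is None:
            return False               # no waiter yet: logged as "unexpected"
        ev.set()
        return True
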
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1209.310433] env[67977]: DEBUG nova.network.neutron [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Updating instance_info_cache with network_info: [{"id": "0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d", "address": "fa:16:3e:b4:09:8a", "network": {"id": "3a9af5b2-dbea-4d4a-a3d1-7b503621344f", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1259516709-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fa83fa9567bf41679eaa6b42ebcbe9bd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7041d198-66a3-40de-bf7d-cfc036e6ed69", "external-id": "nsx-vlan-transportzone-278", "segmentation_id": 278, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c65c9f8-87", "ovs_interfaceid": "0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1209.322755] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Releasing lock "refresh_cache-27743458-4ef0-4ceb-a0bf-cac219dbdc35" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1209.323060] env[67977]: DEBUG nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Instance network_info: |[{"id": "0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d", "address": "fa:16:3e:b4:09:8a", "network": {"id": "3a9af5b2-dbea-4d4a-a3d1-7b503621344f", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1259516709-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fa83fa9567bf41679eaa6b42ebcbe9bd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7041d198-66a3-40de-bf7d-cfc036e6ed69", "external-id": "nsx-vlan-transportzone-278", "segmentation_id": 278, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c65c9f8-87", "ovs_interfaceid": "0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1209.323457] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b4:09:8a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7041d198-66a3-40de-bf7d-cfc036e6ed69', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1209.331315] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Creating folder: Project (fa83fa9567bf41679eaa6b42ebcbe9bd). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1209.331837] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f506aa49-886d-4a95-97f4-b67b3794fdf8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.343216] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Created folder: Project (fa83fa9567bf41679eaa6b42ebcbe9bd) in parent group-v693022. [ 1209.343404] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Creating folder: Instances. Parent ref: group-v693086. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1209.343626] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a0c0117b-df5c-4db0-b66d-c905c2875cf3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.352113] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Created folder: Instances in parent group-v693086. [ 1209.352344] env[67977]: DEBUG oslo.service.loopingcall [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1209.352525] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1209.352711] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d010906d-9bfd-4b54-b264-2081fd58c82b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.372270] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1209.372270] env[67977]: value = "task-3468200" [ 1209.372270] env[67977]: _type = "Task" [ 1209.372270] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1209.379290] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468200, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1209.775129] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1209.776027] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1209.776027] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1209.881818] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468200, 'name': CreateVM_Task, 'duration_secs': 0.311354} completed successfully. 
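
Task task-3468200 above shows the standard oslo.vmware invoke-then-poll cycle: CreateVM_Task returns a task moref immediately, and the session polls its TaskInfo (the "progress is 0%" line, then "completed successfully" with duration_secs) until it finishes or errors. Roughly, with session setup omitted and argument names assumed:

    # Sketch: the invoke/poll pattern behind the CreateVM_Task lines.
    def create_vm(session, folder_ref, config_spec, respool_ref):
        # `session` is an oslo_vmware.api.VMwareAPISession.
        task_ref = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                                      config=config_spec, pool=respool_ref)
        task_info = session.wait_for_task(task_ref)  # polls TaskInfo, raises on error
        return task_info.result                      # moref of the created VM
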
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1209.881944] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1209.882645] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1209.882819] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1209.883207] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1209.883432] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bbe2bc0b-1ff1-4f52-98d8-5b05905bd7ac {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.887754] env[67977]: DEBUG oslo_vmware.api [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Waiting for the task: (returnval){ [ 1209.887754] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523fa575-89bc-9d7e-2968-8975cd0b1ad3" [ 1209.887754] env[67977]: _type = "Task" [ 1209.887754] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1209.894965] env[67977]: DEBUG oslo_vmware.api [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523fa575-89bc-9d7e-2968-8975cd0b1ad3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1210.398410] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1210.398680] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1210.398896] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1210.775570] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1210.977287] env[67977]: DEBUG nova.compute.manager [req-9cba718d-6210-4341-9be0-da6181a5aa42 req-fa8dfab0-caa5-48d4-9671-632f0b5a1831 service nova] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Received event network-changed-0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1210.977539] env[67977]: DEBUG nova.compute.manager [req-9cba718d-6210-4341-9be0-da6181a5aa42 req-fa8dfab0-caa5-48d4-9671-632f0b5a1831 service nova] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Refreshing instance network info cache due to event network-changed-0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1210.977706] env[67977]: DEBUG oslo_concurrency.lockutils [req-9cba718d-6210-4341-9be0-da6181a5aa42 req-fa8dfab0-caa5-48d4-9671-632f0b5a1831 service nova] Acquiring lock "refresh_cache-27743458-4ef0-4ceb-a0bf-cac219dbdc35" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1210.977864] env[67977]: DEBUG oslo_concurrency.lockutils [req-9cba718d-6210-4341-9be0-da6181a5aa42 req-fa8dfab0-caa5-48d4-9671-632f0b5a1831 service nova] Acquired lock "refresh_cache-27743458-4ef0-4ceb-a0bf-cac219dbdc35" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1210.977968] env[67977]: DEBUG nova.network.neutron [req-9cba718d-6210-4341-9be0-da6181a5aa42 req-fa8dfab0-caa5-48d4-9671-632f0b5a1831 service nova] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Refreshing network info cache for port 0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1211.192725] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquiring lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1211.296740] env[67977]: DEBUG nova.network.neutron [req-9cba718d-6210-4341-9be0-da6181a5aa42 req-fa8dfab0-caa5-48d4-9671-632f0b5a1831 service nova] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Updated VIF entry in instance network info cache for port 0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d.
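
Each Acquiring/Acquired/released triple in these entries is oslo.concurrency's named-lock logging; the "refresh_cache-<uuid>" lock serializes network-info rebuilds so this event-driven refresh and any API-driven refresh of the same instance cannot race. The idiom, sketched with a hypothetical helper:

    # Sketch: the named-lock idiom behind the refresh_cache-<uuid> lines.
    from oslo_concurrency import lockutils

    def rebuild_instance_nw_info(instance_uuid):
        """Stand-in for the real cache rebuild."""

    def refresh_network_cache(instance_uuid):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            rebuild_instance_nw_info(instance_uuid)
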
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1211.297107] env[67977]: DEBUG nova.network.neutron [req-9cba718d-6210-4341-9be0-da6181a5aa42 req-fa8dfab0-caa5-48d4-9671-632f0b5a1831 service nova] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Updating instance_info_cache with network_info: [{"id": "0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d", "address": "fa:16:3e:b4:09:8a", "network": {"id": "3a9af5b2-dbea-4d4a-a3d1-7b503621344f", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1259516709-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fa83fa9567bf41679eaa6b42ebcbe9bd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7041d198-66a3-40de-bf7d-cfc036e6ed69", "external-id": "nsx-vlan-transportzone-278", "segmentation_id": 278, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c65c9f8-87", "ovs_interfaceid": "0c65c9f8-874b-47ce-9f6d-2e4fdaebf93d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1211.306938] env[67977]: DEBUG oslo_concurrency.lockutils [req-9cba718d-6210-4341-9be0-da6181a5aa42 req-fa8dfab0-caa5-48d4-9671-632f0b5a1831 service nova] Releasing lock "refresh_cache-27743458-4ef0-4ceb-a0bf-cac219dbdc35" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1211.769958] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1211.774578] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1212.775584] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1212.775852] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1212.786014] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1212.786262] env[67977]: DEBUG oslo_concurrency.lockutils [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1212.786425] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1212.786578] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1212.787659] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dc2412b-8df7-4309-8903-472828b31187 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.796336] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93b229ec-4e10-406a-987b-90e9b412b0a9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.811200] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af6fc8a0-e5da-47fa-923d-feb4d980f07c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.817608] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0597fac0-db59-4744-abfe-2c2f719419d0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.846490] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180951MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1212.846640] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1212.846828] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1212.921728] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b22ae1a7-c9b8-464b-a81c-73144a0176be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1212.921922] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 83b04c8c-39f6-4f58-b965-0a94c063b68b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1212.922038] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1212.922901] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1212.922901] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d7719b11-cef7-4878-a693-24dcd085a1d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1212.922901] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1212.922901] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1212.923102] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1212.923102] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 32d860b3-f438-400f-8296-e62cc662d618 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1212.923243] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1212.935629] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance c82c83e0-ae02-4e15-8f81-23e4ae8ecc5c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1212.947037] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8f1440c5-e712-4635-9f02-f9cda12da693 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1212.956908] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f03fe248-75df-4237-a6dd-cc49012c2331 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1212.967562] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2eb73f59-f6c3-4816-b545-9ab391697785 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1212.978107] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 206998ab-992a-4bc0-b176-eb490b3cb479 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1212.988647] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f66d73a3-535f-45ba-a9c4-2436da6ea511 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1212.997786] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 120b8d8f-3f0e-4cb4-a112-695d6365788a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1213.006777] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1213.015858] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a61121e3-0375-41ef-9dad-0469583640ab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1213.024818] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8351701c-562c-4ec6-998a-a0b1a62f0c5e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1213.033857] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ea5d10a1-2beb-4997-beff-e6a0e24e5b9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1213.042563] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 33412c29-7b03-421f-9502-4c2a639adfd7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1213.052012] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 508bc9c7-8a3b-4ac8-9eda-b2d3f5916997 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1213.060933] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8d6894ae-ecbf-4ad2-ae91-463c88ef5de3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1213.061177] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1213.061325] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1213.332032] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e529a59-9283-46ed-8faa-b9c1524d0986 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.339865] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f74c0ac-b8e3-404a-a8c4-b916a1a47aec {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.368626] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5ebb5e2-71c9-499a-9c0b-8f8cba5cb326 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.376271] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90ab6908-f40e-46b4-8962-f456feec202c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.389787] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1213.398145] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1213.414508] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1213.414695] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.568s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1214.414226] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1214.775751] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1214.775941] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1214.776079] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1214.802110] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1214.802281] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1214.802413] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1214.802541] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1214.802664] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1214.802786] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1214.802906] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1214.803103] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1214.803265] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1214.803392] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1214.803515] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1215.294993] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "e77a441b-952b-42c0-907f-e30888e505a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1215.295249] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "e77a441b-952b-42c0-907f-e30888e505a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1223.437868] env[67977]: DEBUG oslo_concurrency.lockutils [None req-187eec75-b383-4799-9ace-f9e8ed512ff9 tempest-ListImageFiltersTestJSON-414035292 tempest-ListImageFiltersTestJSON-414035292-project-member] Acquiring lock "1e05e7be-d468-4908-a1e4-8a11064277b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1223.438181] env[67977]: DEBUG oslo_concurrency.lockutils [None req-187eec75-b383-4799-9ace-f9e8ed512ff9 tempest-ListImageFiltersTestJSON-414035292 tempest-ListImageFiltersTestJSON-414035292-project-member] Lock "1e05e7be-d468-4908-a1e4-8a11064277b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1223.632033] env[67977]: DEBUG oslo_concurrency.lockutils [None
req-e936d7e9-964b-4423-967d-fe4e52b2435c tempest-ListImageFiltersTestJSON-414035292 tempest-ListImageFiltersTestJSON-414035292-project-member] Acquiring lock "ee368449-13fb-431d-9ae5-f8c08d777336" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1223.632209] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e936d7e9-964b-4423-967d-fe4e52b2435c tempest-ListImageFiltersTestJSON-414035292 tempest-ListImageFiltersTestJSON-414035292-project-member] Lock "ee368449-13fb-431d-9ae5-f8c08d777336" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1252.264813] env[67977]: WARNING oslo_vmware.rw_handles [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1252.264813] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1252.264813] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1252.264813] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1252.264813] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1252.264813] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1252.264813] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1252.264813] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1252.264813] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1252.264813] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1252.264813] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1252.264813] env[67977]: ERROR oslo_vmware.rw_handles [ 1252.265514] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/819814e0-a044-429f-a056-6f8d78265ae6/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1252.267352] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1252.267594] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Copying Virtual Disk [datastore1]
vmware_temp/819814e0-a044-429f-a056-6f8d78265ae6/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/819814e0-a044-429f-a056-6f8d78265ae6/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1252.267886] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3dba7e28-75b5-431b-8a9b-b4ec4c3e66a8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.275850] env[67977]: DEBUG oslo_vmware.api [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Waiting for the task: (returnval){ [ 1252.275850] env[67977]: value = "task-3468201" [ 1252.275850] env[67977]: _type = "Task" [ 1252.275850] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1252.283532] env[67977]: DEBUG oslo_vmware.api [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Task: {'id': task-3468201, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1252.787058] env[67977]: DEBUG oslo_vmware.exceptions [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1252.787058] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1252.787391] env[67977]: ERROR nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1252.787391] env[67977]: Faults: ['InvalidArgument'] [ 1252.787391] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Traceback (most recent call last): [ 1252.787391] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1252.787391] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] yield resources [ 1252.787391] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1252.787391] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] self.driver.spawn(context, instance, image_meta, [ 1252.787391] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1252.787391] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1252.787391] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1252.787391] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] self._fetch_image_if_missing(context, vi) [ 1252.787391] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] image_cache(vi, tmp_image_ds_loc) [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] vm_util.copy_virtual_disk( [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] session._wait_for_task(vmdk_copy_task) [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] return self.wait_for_task(task_ref) [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] return evt.wait() [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] result = hub.switch() [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1252.787747] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] return self.greenlet.switch() [ 1252.788126] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1252.788126] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] self.f(*self.args, **self.kw) [ 1252.788126] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1252.788126] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] raise exceptions.translate_fault(task_info.error) [ 
1252.788126] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1252.788126] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Faults: ['InvalidArgument'] [ 1252.788126] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] [ 1252.788126] env[67977]: INFO nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Terminating instance [ 1252.789307] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1252.789510] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1252.789857] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4086c14d-ca7e-44e8-8493-9883e376f408 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.791913] env[67977]: DEBUG nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1252.792116] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1252.792836] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98cdb2ca-c745-4f65-82e2-63b78ef464ab {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.799816] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1252.800094] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5fc06c7f-7242-4eb2-9462-2878332aa467 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.802464] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1252.802658] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1252.803326] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2ae10770-5687-42d9-abc6-c44d855e7953 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.808108] env[67977]: DEBUG oslo_vmware.api [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Waiting for the task: (returnval){ [ 1252.808108] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]520eeb1b-26b7-6f74-304c-4729ec5f0249" [ 1252.808108] env[67977]: _type = "Task" [ 1252.808108] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1252.815891] env[67977]: DEBUG oslo_vmware.api [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]520eeb1b-26b7-6f74-304c-4729ec5f0249, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1252.866574] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1252.866786] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1252.866967] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Deleting the datastore file [datastore1] b22ae1a7-c9b8-464b-a81c-73144a0176be {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1252.867261] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f75447d9-11de-4180-8281-f8266a5ed697 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.874156] env[67977]: DEBUG oslo_vmware.api [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Waiting for the task: (returnval){ [ 1252.874156] env[67977]: value = "task-3468203" [ 1252.874156] env[67977]: _type = "Task" [ 1252.874156] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1252.882485] env[67977]: DEBUG oslo_vmware.api [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Task: {'id': task-3468203, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1253.318275] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1253.318579] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Creating directory with path [datastore1] vmware_temp/988464e9-70c0-42b6-841f-7fb66c1b358c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1253.318799] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-733af285-2220-4cd4-8365-1f2eba3643a1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.329238] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Created directory with path [datastore1] vmware_temp/988464e9-70c0-42b6-841f-7fb66c1b358c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1253.329421] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Fetch image to [datastore1] vmware_temp/988464e9-70c0-42b6-841f-7fb66c1b358c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1253.329608] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/988464e9-70c0-42b6-841f-7fb66c1b358c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1253.330323] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f3ea3ec-3feb-41b0-a0f3-1854c73d23e8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.336784] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57a96384-93c2-4306-8c80-05771e8d0c65 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.345387] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62f0ca91-bc66-441b-8dda-f769181ed064 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.378859] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-922e3e9a-17bd-4fd2-a4b3-92f6b57b2483 {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.385587] env[67977]: DEBUG oslo_vmware.api [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Task: {'id': task-3468203, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074588} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1253.387011] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1253.387205] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1253.387379] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1253.387709] env[67977]: INFO nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Took 0.60 seconds to destroy the instance on the hypervisor. 
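Note on the task handling above: the invoke-then-poll sequence (CopyVirtualDisk_Task, DeleteDatastoreFile_Task, the repeated "progress is 0%." lines, and the final "completed successfully") is oslo.vmware's standard asynchronous-task pattern. Below is a minimal standalone sketch of that pattern, not Nova's actual wiring; the endpoint, credentials, and the None datacenter argument are placeholders, not values from this log.

from oslo_vmware import api

# Placeholder vCenter endpoint and credentials -- assumptions for the sketch.
session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                               api_retry_count=10, task_poll_interval=0.5)

# Asynchronous vSphere methods return a Task moref immediately
# (e.g. "task-3468203" above); the work runs server-side.
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task',
    session.vim.service_content.fileManager,
    name='[datastore1] b22ae1a7-c9b8-464b-a81c-73144a0176be',
    datacenter=None)  # real callers pass a Datacenter moref here

# wait_for_task() polls TaskInfo every task_poll_interval seconds,
# producing the "Task: {...} progress is N%." DEBUG lines, and raises a
# translated fault (e.g. VimFaultException for InvalidArgument) if the
# task ends in an error state, as seen earlier in this log.
session.wait_for_task(task)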
[ 1253.389314] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6b468e6b-2e74-4132-9dcb-44d817fa50a1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.391157] env[67977]: DEBUG nova.compute.claims [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1253.391333] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1253.391541] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1253.417205] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1253.472630] env[67977]: DEBUG oslo_vmware.rw_handles [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/988464e9-70c0-42b6-841f-7fb66c1b358c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1253.531687] env[67977]: DEBUG oslo_vmware.rw_handles [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1253.531877] env[67977]: DEBUG oslo_vmware.rw_handles [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/988464e9-70c0-42b6-841f-7fb66c1b358c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1253.790675] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1410e803-8b94-4d62-b3dd-8b300ef5fa03 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.798295] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20b7ac75-85f0-4988-9b38-ea49ad393bb8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.828315] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-189e0778-750d-4fbf-8a0e-834067dfffca {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.835139] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7d66538-9aca-41ab-9625-983f86386651 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.847778] env[67977]: DEBUG nova.compute.provider_tree [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1253.856748] env[67977]: DEBUG nova.scheduler.client.report [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1253.871230] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.480s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1253.871802] env[67977]: ERROR nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1253.871802] env[67977]: Faults: ['InvalidArgument'] [ 1253.871802] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Traceback (most recent call last): [ 1253.871802] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1253.871802] env[67977]: 
ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] self.driver.spawn(context, instance, image_meta, [ 1253.871802] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1253.871802] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1253.871802] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1253.871802] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] self._fetch_image_if_missing(context, vi) [ 1253.871802] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1253.871802] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] image_cache(vi, tmp_image_ds_loc) [ 1253.871802] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] vm_util.copy_virtual_disk( [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] session._wait_for_task(vmdk_copy_task) [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] return self.wait_for_task(task_ref) [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] return evt.wait() [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] result = hub.switch() [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] return self.greenlet.switch() [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1253.872249] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] self.f(*self.args, **self.kw) [ 1253.872813] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1253.872813] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] raise exceptions.translate_fault(task_info.error) [ 1253.872813] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1253.872813] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Faults: ['InvalidArgument'] [ 1253.872813] env[67977]: ERROR nova.compute.manager [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] [ 1253.872813] env[67977]: DEBUG nova.compute.utils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1253.873880] env[67977]: DEBUG nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Build of instance b22ae1a7-c9b8-464b-a81c-73144a0176be was re-scheduled: A specified parameter was not correct: fileType [ 1253.873880] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1253.874269] env[67977]: DEBUG nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1253.874442] env[67977]: DEBUG nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1253.874614] env[67977]: DEBUG nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1253.874776] env[67977]: DEBUG nova.network.neutron [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1254.233403] env[67977]: DEBUG nova.network.neutron [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1254.247028] env[67977]: INFO nova.compute.manager [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Took 0.37 seconds to deallocate network for instance. [ 1254.344025] env[67977]: INFO nova.scheduler.client.report [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Deleted allocations for instance b22ae1a7-c9b8-464b-a81c-73144a0176be [ 1254.366603] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ca993657-f9d6-4201-990f-974c102801ff tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Lock "b22ae1a7-c9b8-464b-a81c-73144a0176be" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 513.300s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.367745] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Lock "b22ae1a7-c9b8-464b-a81c-73144a0176be" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 314.706s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1254.367967] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Acquiring lock "b22ae1a7-c9b8-464b-a81c-73144a0176be-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1254.368766] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Lock "b22ae1a7-c9b8-464b-a81c-73144a0176be-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: 
waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1254.368766] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Lock "b22ae1a7-c9b8-464b-a81c-73144a0176be-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.370832] env[67977]: INFO nova.compute.manager [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Terminating instance [ 1254.372324] env[67977]: DEBUG nova.compute.manager [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1254.372525] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1254.372986] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bc94678b-fc7a-4daf-b446-b81972410724 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.381893] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c023576-51e9-4133-9467-42ddda46b4f5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.392894] env[67977]: DEBUG nova.compute.manager [None req-db7a3000-5043-4cc2-94f5-2886aab67c87 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: c82c83e0-ae02-4e15-8f81-23e4ae8ecc5c] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1254.412809] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b22ae1a7-c9b8-464b-a81c-73144a0176be could not be found. [ 1254.413087] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1254.413291] env[67977]: INFO nova.compute.manager [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Took 0.04 seconds to destroy the instance on the hypervisor. 
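Note on the lock lines above: the Acquiring / acquired / "released" triplets (lockutils.py:404/409/423) come from oslo.concurrency's synchronized decorator, which Nova applies at call time with the instance UUID as the lock name so that the build and terminate paths serialize on the same instance. A minimal sketch of that pattern follows; it is an illustration, not Nova's code (Nova additionally routes through nova.utils.synchronized, which adds a 'nova-' lock prefix), and the function body is a placeholder.

from oslo_concurrency import lockutils

def terminate_instance(uuid):
    # Decorating a nested function at call time with the UUID as the
    # lock name yields qualnames like
    # "...terminate_instance.<locals>.do_terminate_instance" in the
    # Acquiring/acquired/released DEBUG lines above.
    @lockutils.synchronized(uuid)
    def do_terminate_instance():
        print('destroying %s' % uuid)  # placeholder for real teardown

    do_terminate_instance()

terminate_instance('b22ae1a7-c9b8-464b-a81c-73144a0176be')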
[ 1254.413556] env[67977]: DEBUG oslo.service.loopingcall [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1254.413783] env[67977]: DEBUG nova.compute.manager [-] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1254.413881] env[67977]: DEBUG nova.network.neutron [-] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1254.425906] env[67977]: DEBUG nova.compute.manager [None req-db7a3000-5043-4cc2-94f5-2886aab67c87 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: c82c83e0-ae02-4e15-8f81-23e4ae8ecc5c] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1254.437915] env[67977]: DEBUG nova.network.neutron [-] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1254.447395] env[67977]: INFO nova.compute.manager [-] [instance: b22ae1a7-c9b8-464b-a81c-73144a0176be] Took 0.03 seconds to deallocate network for instance. [ 1254.451936] env[67977]: DEBUG oslo_concurrency.lockutils [None req-db7a3000-5043-4cc2-94f5-2886aab67c87 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "c82c83e0-ae02-4e15-8f81-23e4ae8ecc5c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 236.122s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.462929] env[67977]: DEBUG nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Starting instance... 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1254.521779] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1254.522055] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1254.523528] env[67977]: INFO nova.compute.claims [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1254.556695] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d1c8cd14-2f3f-4456-a4e4-0185b65368b6 tempest-ServerPasswordTestJSON-1355607498 tempest-ServerPasswordTestJSON-1355607498-project-member] Lock "b22ae1a7-c9b8-464b-a81c-73144a0176be" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.188s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.847128] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9861064-5750-4514-8c8c-311b322366c5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.854849] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1194618f-60a6-4b3f-8f5e-d852a713d6cc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.885166] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-902445e2-764e-498a-a95e-06b6a7cb1686 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.892391] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d4c3199-8139-4f90-8641-080d3f8013b6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.905146] env[67977]: DEBUG nova.compute.provider_tree [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1254.913608] env[67977]: DEBUG nova.scheduler.client.report [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1254.925667] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.404s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.926154] env[67977]: DEBUG nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1254.964086] env[67977]: DEBUG nova.compute.utils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1254.965381] env[67977]: DEBUG nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1254.965569] env[67977]: DEBUG nova.network.neutron [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1254.975094] env[67977]: DEBUG nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1255.038064] env[67977]: DEBUG nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1255.064474] env[67977]: DEBUG nova.virt.hardware [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1255.064714] env[67977]: DEBUG nova.virt.hardware [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1255.064873] env[67977]: DEBUG nova.virt.hardware [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1255.065157] env[67977]: DEBUG nova.virt.hardware [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1255.065335] env[67977]: DEBUG nova.virt.hardware [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1255.065485] env[67977]: DEBUG nova.virt.hardware [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1255.065707] env[67977]: DEBUG nova.virt.hardware [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1255.065901] env[67977]: DEBUG nova.virt.hardware [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1255.066089] env[67977]: DEBUG nova.virt.hardware [None 
req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1255.066297] env[67977]: DEBUG nova.virt.hardware [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1255.066443] env[67977]: DEBUG nova.virt.hardware [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1255.067421] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffdaa0cb-2331-4d84-bbb1-80cafa580ea0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.071173] env[67977]: DEBUG nova.policy [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9edadfabae414eb9843451bbb1b931ad', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eb5c71d2daaa48f09f9f32a17b9d41c6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1255.078057] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c9bc3f8-d9c3-48af-863b-e28efdb7caf4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.486049] env[67977]: DEBUG nova.network.neutron [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Successfully created port: 6b6a634e-414d-4941-8f58-814fbd85cbf0 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1256.292846] env[67977]: DEBUG nova.compute.manager [req-9624d4e8-8830-40c3-a90f-c16093f621e9 req-f6b1ac8f-6204-45a7-b595-f0d826f6a2ce service nova] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Received event network-vif-plugged-6b6a634e-414d-4941-8f58-814fbd85cbf0 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1256.293204] env[67977]: DEBUG oslo_concurrency.lockutils [req-9624d4e8-8830-40c3-a90f-c16093f621e9 req-f6b1ac8f-6204-45a7-b595-f0d826f6a2ce service nova] Acquiring lock "8f1440c5-e712-4635-9f02-f9cda12da693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1256.293314] env[67977]: DEBUG oslo_concurrency.lockutils [req-9624d4e8-8830-40c3-a90f-c16093f621e9 req-f6b1ac8f-6204-45a7-b595-f0d826f6a2ce service nova] Lock
"8f1440c5-e712-4635-9f02-f9cda12da693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1256.293456] env[67977]: DEBUG oslo_concurrency.lockutils [req-9624d4e8-8830-40c3-a90f-c16093f621e9 req-f6b1ac8f-6204-45a7-b595-f0d826f6a2ce service nova] Lock "8f1440c5-e712-4635-9f02-f9cda12da693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1256.293625] env[67977]: DEBUG nova.compute.manager [req-9624d4e8-8830-40c3-a90f-c16093f621e9 req-f6b1ac8f-6204-45a7-b595-f0d826f6a2ce service nova] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] No waiting events found dispatching network-vif-plugged-6b6a634e-414d-4941-8f58-814fbd85cbf0 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1256.293790] env[67977]: WARNING nova.compute.manager [req-9624d4e8-8830-40c3-a90f-c16093f621e9 req-f6b1ac8f-6204-45a7-b595-f0d826f6a2ce service nova] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Received unexpected event network-vif-plugged-6b6a634e-414d-4941-8f58-814fbd85cbf0 for instance with vm_state building and task_state spawning. [ 1256.311091] env[67977]: DEBUG nova.network.neutron [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Successfully updated port: 6b6a634e-414d-4941-8f58-814fbd85cbf0 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1256.323600] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "refresh_cache-8f1440c5-e712-4635-9f02-f9cda12da693" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1256.323831] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquired lock "refresh_cache-8f1440c5-e712-4635-9f02-f9cda12da693" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1256.323995] env[67977]: DEBUG nova.network.neutron [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1256.390589] env[67977]: DEBUG nova.network.neutron [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1256.569812] env[67977]: DEBUG nova.network.neutron [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Updating instance_info_cache with network_info: [{"id": "6b6a634e-414d-4941-8f58-814fbd85cbf0", "address": "fa:16:3e:76:01:0f", "network": {"id": "444f19a5-c228-4ca2-ab8f-91fb58200775", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1364974740-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb5c71d2daaa48f09f9f32a17b9d41c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b033f4d-2e92-4702-add6-410a29d3f251", "external-id": "nsx-vlan-transportzone-649", "segmentation_id": 649, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b6a634e-41", "ovs_interfaceid": "6b6a634e-414d-4941-8f58-814fbd85cbf0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1256.584810] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Releasing lock "refresh_cache-8f1440c5-e712-4635-9f02-f9cda12da693" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1256.584810] env[67977]: DEBUG nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Instance network_info: |[{"id": "6b6a634e-414d-4941-8f58-814fbd85cbf0", "address": "fa:16:3e:76:01:0f", "network": {"id": "444f19a5-c228-4ca2-ab8f-91fb58200775", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1364974740-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb5c71d2daaa48f09f9f32a17b9d41c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b033f4d-2e92-4702-add6-410a29d3f251", "external-id": "nsx-vlan-transportzone-649", "segmentation_id": 649, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b6a634e-41", "ovs_interfaceid": "6b6a634e-414d-4941-8f58-814fbd85cbf0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1256.585163] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:76:01:0f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4b033f4d-2e92-4702-add6-410a29d3f251', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6b6a634e-414d-4941-8f58-814fbd85cbf0', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1256.592802] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Creating folder: Project (eb5c71d2daaa48f09f9f32a17b9d41c6). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1256.593337] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-718214a9-799f-4f6d-8ac5-50aa01dfd87d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.603427] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Created folder: Project (eb5c71d2daaa48f09f9f32a17b9d41c6) in parent group-v693022. [ 1256.603607] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Creating folder: Instances. Parent ref: group-v693089. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1256.603821] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b3e1df9f-dd5e-4ac6-9e4b-2dc9441a7606 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.612540] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Created folder: Instances in parent group-v693089. [ 1256.612754] env[67977]: DEBUG oslo.service.loopingcall [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1256.612931] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1256.613131] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-17c73685-06d4-4e6b-8c06-3347d433890b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.632421] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1256.632421] env[67977]: value = "task-3468206" [ 1256.632421] env[67977]: _type = "Task" [ 1256.632421] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1256.639519] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468206, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1257.142158] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468206, 'name': CreateVM_Task, 'duration_secs': 0.299024} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1257.142321] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1257.142956] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1257.143138] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1257.143459] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1257.143714] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-deb7e05d-cced-4d44-89bd-30b9f838d3bd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1257.148386] env[67977]: DEBUG oslo_vmware.api [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for the task: (returnval){ [ 1257.148386] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523226f8-d535-a369-1a9e-d51af8116fbf" [ 1257.148386] env[67977]: _type = "Task" [ 1257.148386] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1257.157993] env[67977]: DEBUG oslo_vmware.api [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523226f8-d535-a369-1a9e-d51af8116fbf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1257.658211] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1257.658478] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1257.658730] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1258.348695] env[67977]: DEBUG nova.compute.manager [req-764824a4-e491-4d68-a5fb-6b8abbbc176b req-a7f2525a-291f-4300-a676-b55049b44942 service nova] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Received event network-changed-6b6a634e-414d-4941-8f58-814fbd85cbf0 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1258.349017] env[67977]: DEBUG nova.compute.manager [req-764824a4-e491-4d68-a5fb-6b8abbbc176b req-a7f2525a-291f-4300-a676-b55049b44942 service nova] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Refreshing instance network info cache due to event network-changed-6b6a634e-414d-4941-8f58-814fbd85cbf0. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1258.349138] env[67977]: DEBUG oslo_concurrency.lockutils [req-764824a4-e491-4d68-a5fb-6b8abbbc176b req-a7f2525a-291f-4300-a676-b55049b44942 service nova] Acquiring lock "refresh_cache-8f1440c5-e712-4635-9f02-f9cda12da693" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1258.349259] env[67977]: DEBUG oslo_concurrency.lockutils [req-764824a4-e491-4d68-a5fb-6b8abbbc176b req-a7f2525a-291f-4300-a676-b55049b44942 service nova] Acquired lock "refresh_cache-8f1440c5-e712-4635-9f02-f9cda12da693" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1258.349417] env[67977]: DEBUG nova.network.neutron [req-764824a4-e491-4d68-a5fb-6b8abbbc176b req-a7f2525a-291f-4300-a676-b55049b44942 service nova] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Refreshing network info cache for port 6b6a634e-414d-4941-8f58-814fbd85cbf0 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1258.856993] env[67977]: DEBUG nova.network.neutron [req-764824a4-e491-4d68-a5fb-6b8abbbc176b req-a7f2525a-291f-4300-a676-b55049b44942 service nova] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Updated VIF entry in instance network info cache for port 6b6a634e-414d-4941-8f58-814fbd85cbf0. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1258.857362] env[67977]: DEBUG nova.network.neutron [req-764824a4-e491-4d68-a5fb-6b8abbbc176b req-a7f2525a-291f-4300-a676-b55049b44942 service nova] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Updating instance_info_cache with network_info: [{"id": "6b6a634e-414d-4941-8f58-814fbd85cbf0", "address": "fa:16:3e:76:01:0f", "network": {"id": "444f19a5-c228-4ca2-ab8f-91fb58200775", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1364974740-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb5c71d2daaa48f09f9f32a17b9d41c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b033f4d-2e92-4702-add6-410a29d3f251", "external-id": "nsx-vlan-transportzone-649", "segmentation_id": 649, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b6a634e-41", "ovs_interfaceid": "6b6a634e-414d-4941-8f58-814fbd85cbf0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1258.866635] env[67977]: DEBUG oslo_concurrency.lockutils [req-764824a4-e491-4d68-a5fb-6b8abbbc176b req-a7f2525a-291f-4300-a676-b55049b44942 service nova] Releasing lock "refresh_cache-8f1440c5-e712-4635-9f02-f9cda12da693" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1265.670198] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "8f1440c5-e712-4635-9f02-f9cda12da693" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1268.799659] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1269.775120] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1269.775336] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1270.692807] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquiring lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1270.693490] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1270.775629] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1270.775629] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Cleaning up deleted instances with incomplete migration {{(pid=67977) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1271.780474] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1271.780821] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1272.775070] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1273.775095] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1273.775382] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1273.775599] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1273.775809] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task
ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1273.788044] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1273.788044] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1273.788044] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1273.788267] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1273.789568] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b0d00cb-868b-4482-8e02-0af5d134a44f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1273.798837] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcd4f16a-492e-4ea6-be7c-8dcfa1927425 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1273.813439] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a75723b-b917-4f4a-8bc6-672516165dc6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1273.819883] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52fd2299-287f-4323-9f67-3206687cb1c4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1273.850157] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180938MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1273.850315] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1273.850671] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" 
:: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1274.001773] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 83b04c8c-39f6-4f58-b965-0a94c063b68b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1274.001943] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1274.002088] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1274.002218] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d7719b11-cef7-4878-a693-24dcd085a1d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1274.002415] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1274.002468] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1274.002624] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1274.002694] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 32d860b3-f438-400f-8296-e62cc662d618 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1274.002785] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1274.002898] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8f1440c5-e712-4635-9f02-f9cda12da693 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1274.015092] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f03fe248-75df-4237-a6dd-cc49012c2331 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.025689] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2eb73f59-f6c3-4816-b545-9ab391697785 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.036811] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 206998ab-992a-4bc0-b176-eb490b3cb479 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.046781] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f66d73a3-535f-45ba-a9c4-2436da6ea511 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.057913] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 120b8d8f-3f0e-4cb4-a112-695d6365788a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.067563] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.081163] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a61121e3-0375-41ef-9dad-0469583640ab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.091477] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8351701c-562c-4ec6-998a-a0b1a62f0c5e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.101154] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ea5d10a1-2beb-4997-beff-e6a0e24e5b9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.112555] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 33412c29-7b03-421f-9502-4c2a639adfd7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.122512] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 508bc9c7-8a3b-4ac8-9eda-b2d3f5916997 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.134610] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8d6894ae-ecbf-4ad2-ae91-463c88ef5de3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.145361] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e77a441b-952b-42c0-907f-e30888e505a8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.155551] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1e05e7be-d468-4908-a1e4-8a11064277b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.165801] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ee368449-13fb-431d-9ae5-f8c08d777336 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.175275] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1274.175433] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1274.175589] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1274.193368] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing inventories for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1274.209528] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating ProviderTree inventory for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1274.209724] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating inventory in ProviderTree for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1274.221972] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing aggregate associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, aggregates: None {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1274.240097] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing trait associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, traits: COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1274.532370] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35ae112a-6080-48ca-b7c0-e0239b0f77b6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1274.540186] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-58c65da2-69f0-40d3-bf0e-fbeb6a89234b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1274.569650] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a2bdcb4-02bf-4f11-be2a-892a1943acac {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1274.576812] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fbd510b-dc7e-4c34-b0b7-35ca1d51ad13 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1274.589631] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1274.598689] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1274.612329] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1274.612526] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.762s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1274.776594] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1276.783616] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1276.783902] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1276.783974] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1276.806200] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 
83b04c8c-39f6-4f58-b965-0a94c063b68b] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1276.806409] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1276.806549] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1276.806706] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1276.806807] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1276.806933] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1276.807067] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1276.807190] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1276.807315] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1276.807468] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1276.807596] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1276.808110] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1276.808257] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Cleaning up deleted instances {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1276.818958] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] There are 0 instances to clean {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1298.681462] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1298.703482] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Getting list of instances from cluster (obj){ [ 1298.703482] env[67977]: value = "domain-c8" [ 1298.703482] env[67977]: _type = "ClusterComputeResource" [ 1298.703482] env[67977]: } {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1298.704754] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e13476a2-62fa-41c4-83c6-d28ea36ce8d5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.722913] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Got total of 10 instances {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1298.723114] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 83b04c8c-39f6-4f58-b965-0a94c063b68b {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1298.723312] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1298.723468] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 48d09ae0-ab95-45e8-a916-ecf24abb66a0 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1298.723617] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid d7719b11-cef7-4878-a693-24dcd085a1d7 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1298.723765] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 6e2f1b5e-7bdc-463d-9822-810f99b81623 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1298.723911] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid eae30b17-eea5-46aa-bb09-91ebca29ea6d {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1298.724072] 
env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1298.724226] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 32d860b3-f438-400f-8296-e62cc662d618 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1298.724369] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 27743458-4ef0-4ceb-a0bf-cac219dbdc35 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1298.724511] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 8f1440c5-e712-4635-9f02-f9cda12da693 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1298.724830] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "83b04c8c-39f6-4f58-b965-0a94c063b68b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.725072] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.725276] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.725507] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "d7719b11-cef7-4878-a693-24dcd085a1d7" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.725740] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "6e2f1b5e-7bdc-463d-9822-810f99b81623" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.725935] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.726147] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.726343] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "32d860b3-f438-400f-8296-e62cc662d618" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.726550] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.726743] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "8f1440c5-e712-4635-9f02-f9cda12da693" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1299.177605] env[67977]: WARNING oslo_vmware.rw_handles [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1299.177605] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1299.177605] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1299.177605] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1299.177605] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1299.177605] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1299.177605] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1299.177605] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1299.177605] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1299.177605] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1299.177605] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1299.177605] env[67977]: ERROR oslo_vmware.rw_handles [ 1299.178108] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/988464e9-70c0-42b6-841f-7fb66c1b358c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1299.180654] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 
tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1299.180903] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Copying Virtual Disk [datastore1] vmware_temp/988464e9-70c0-42b6-841f-7fb66c1b358c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/988464e9-70c0-42b6-841f-7fb66c1b358c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1299.181201] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b62bbe23-e3bf-4ffd-9435-1cd4f8d34992 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.189521] env[67977]: DEBUG oslo_vmware.api [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Waiting for the task: (returnval){ [ 1299.189521] env[67977]: value = "task-3468207" [ 1299.189521] env[67977]: _type = "Task" [ 1299.189521] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1299.197402] env[67977]: DEBUG oslo_vmware.api [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Task: {'id': task-3468207, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1299.699755] env[67977]: DEBUG oslo_vmware.exceptions [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1299.700090] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1299.700493] env[67977]: ERROR nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1299.700493] env[67977]: Faults: ['InvalidArgument'] [ 1299.700493] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Traceback (most recent call last): [ 1299.700493] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1299.700493] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] yield resources [ 1299.700493] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1299.700493] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] self.driver.spawn(context, instance, image_meta, [ 1299.700493] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1299.700493] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1299.700493] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1299.700493] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] self._fetch_image_if_missing(context, vi) [ 1299.700493] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] image_cache(vi, tmp_image_ds_loc) [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] vm_util.copy_virtual_disk( [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] session._wait_for_task(vmdk_copy_task) [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] return self.wait_for_task(task_ref) [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] return evt.wait() [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] result = hub.switch() [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1299.700914] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] return self.greenlet.switch() [ 1299.701374] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1299.701374] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] self.f(*self.args, **self.kw) [ 1299.701374] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1299.701374] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] raise exceptions.translate_fault(task_info.error) [ 1299.701374] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1299.701374] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Faults: ['InvalidArgument'] [ 1299.701374] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] [ 1299.701374] env[67977]: INFO nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Terminating instance [ 1299.702430] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1299.702639] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1299.702883] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-53526d78-6fc5-44e8-b9e4-2f72a2929811 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.705152] env[67977]: DEBUG nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1299.705344] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1299.706083] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ba31154-0b68-42b1-ba17-2657a8c28061 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.713171] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1299.713380] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-63af74cc-17d0-4d1a-8508-80b137477ad4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.715499] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1299.715674] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1299.716606] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c918e1af-bf21-4c01-8715-a3abd9e8fdd0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.722954] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for the task: (returnval){ [ 1299.722954] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5291f7b1-dc3e-e656-c1e4-9680ebe63151" [ 1299.722954] env[67977]: _type = "Task" [ 1299.722954] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1299.730223] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5291f7b1-dc3e-e656-c1e4-9680ebe63151, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1299.787340] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1299.787553] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1299.787733] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Deleting the datastore file [datastore1] 83b04c8c-39f6-4f58-b965-0a94c063b68b {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1299.787999] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9e26c506-e7aa-4d5d-83be-72e30642e472 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.794314] env[67977]: DEBUG oslo_vmware.api [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Waiting for the task: (returnval){ [ 1299.794314] env[67977]: value = "task-3468209" [ 1299.794314] env[67977]: _type = "Task" [ 1299.794314] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1299.801820] env[67977]: DEBUG oslo_vmware.api [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Task: {'id': task-3468209, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1300.232318] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1300.232572] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Creating directory with path [datastore1] vmware_temp/b283aff3-a7f4-4641-bc3c-98ee0122d698/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1300.232802] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-428524df-2b1d-481f-9fac-188f22fd1fbc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.244564] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Created directory with path [datastore1] vmware_temp/b283aff3-a7f4-4641-bc3c-98ee0122d698/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1300.244750] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Fetch image to [datastore1] vmware_temp/b283aff3-a7f4-4641-bc3c-98ee0122d698/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1300.244919] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/b283aff3-a7f4-4641-bc3c-98ee0122d698/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1300.245615] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57eeebb3-5970-48be-a004-5bc437de4b6b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.251976] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de039b7e-3803-4279-a08b-ab6786631b2f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.260644] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-232c05fd-9b62-4b31-ac26-c76c18cdecd9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.291449] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-95b42a4f-712b-4a35-b239-49318e6ae8b5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.299477] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0aca7b70-a797-4c27-b677-510563b00c70 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.303584] env[67977]: DEBUG oslo_vmware.api [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Task: {'id': task-3468209, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070001} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1300.304118] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1300.304308] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1300.304477] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1300.304645] env[67977]: INFO nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Took 0.60 seconds to destroy the instance on the hypervisor. 
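The task-3468207 and task-3468209 entries above show the oslo.vmware pattern at work: submit a vCenter task (CopyVirtualDisk_Task, DeleteDatastoreFile_Task), then poll it via _poll_task until it reaches a terminal state, at which point wait_for_task either returns the task info (including duration_secs) or raises the translated fault. A minimal sketch of that polling loop, using hypothetical get_task_info/task_ref stand-ins rather than the real oslo.vmware internals:

import time

# Hypothetical stand-ins for the vCenter task API; the real client is
# oslo.vmware's wait_for_task, which wraps the same idea in a
# green-thread friendly looping call.
def wait_for_task(get_task_info, task_ref, interval=0.5, timeout=300):
    """Poll a task until it reaches a terminal state or times out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info(task_ref)        # dict-like task info
        if info["state"] == "success":
            return info                        # e.g. carries duration_secs
        if info["state"] == "error":
            # oslo.vmware translates this into a VimFaultException,
            # as seen in the tracebacks elsewhere in this log.
            raise RuntimeError(info["error"])
        time.sleep(interval)                   # log shows "progress is 0%."
    raise TimeoutError(f"task {task_ref} did not complete in {timeout}s")

The real implementation drives this loop from an eventlet looping call, which is why failures surface through evt.wait() and hub.switch() frames in the tracebacks above.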
[ 1300.306702] env[67977]: DEBUG nova.compute.claims [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1300.306890] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1300.307127] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1300.327092] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1300.376746] env[67977]: DEBUG oslo_vmware.rw_handles [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b283aff3-a7f4-4641-bc3c-98ee0122d698/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1300.436385] env[67977]: DEBUG oslo_vmware.rw_handles [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1300.436574] env[67977]: DEBUG oslo_vmware.rw_handles [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b283aff3-a7f4-4641-bc3c-98ee0122d698/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1300.683946] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32bba81b-7cec-4df6-a637-af2e9e2a6c66 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.691635] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3453390b-ae67-435d-8c20-0ba08b39a5b3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.720514] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c3569a8-0cdb-4356-889f-64004c0828f0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.727475] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88bf8b08-3bdf-4a1c-b45a-421e7a840ed2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.741229] env[67977]: DEBUG nova.compute.provider_tree [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1300.751070] env[67977]: DEBUG nova.scheduler.client.report [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1300.764804] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.458s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1300.765571] env[67977]: ERROR nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1300.765571] env[67977]: Faults: ['InvalidArgument'] [ 1300.765571] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Traceback (most recent call last): [ 1300.765571] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1300.765571] env[67977]: ERROR 
nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] self.driver.spawn(context, instance, image_meta, [ 1300.765571] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1300.765571] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1300.765571] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1300.765571] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] self._fetch_image_if_missing(context, vi) [ 1300.765571] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1300.765571] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] image_cache(vi, tmp_image_ds_loc) [ 1300.765571] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] vm_util.copy_virtual_disk( [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] session._wait_for_task(vmdk_copy_task) [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] return self.wait_for_task(task_ref) [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] return evt.wait() [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] result = hub.switch() [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] return self.greenlet.switch() [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1300.765980] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] self.f(*self.args, **self.kw) [ 1300.766402] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1300.766402] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] raise exceptions.translate_fault(task_info.error) [ 1300.766402] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1300.766402] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Faults: ['InvalidArgument'] [ 1300.766402] env[67977]: ERROR nova.compute.manager [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] [ 1300.766580] env[67977]: DEBUG nova.compute.utils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1300.771049] env[67977]: DEBUG nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Build of instance 83b04c8c-39f6-4f58-b965-0a94c063b68b was re-scheduled: A specified parameter was not correct: fileType [ 1300.771049] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1300.771049] env[67977]: DEBUG nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1300.771237] env[67977]: DEBUG nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1300.771406] env[67977]: DEBUG nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1300.771642] env[67977]: DEBUG nova.network.neutron [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1301.142247] env[67977]: DEBUG nova.network.neutron [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1301.155741] env[67977]: INFO nova.compute.manager [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Took 0.38 seconds to deallocate network for instance. [ 1301.251532] env[67977]: INFO nova.scheduler.client.report [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Deleted allocations for instance 83b04c8c-39f6-4f58-b965-0a94c063b68b [ 1301.274094] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e8354dbd-1b29-469c-a050-d7cbd71a739b tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Lock "83b04c8c-39f6-4f58-b965-0a94c063b68b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 559.643s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1301.274323] env[67977]: DEBUG oslo_concurrency.lockutils [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Lock "83b04c8c-39f6-4f58-b965-0a94c063b68b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 361.359s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1301.274421] env[67977]: DEBUG oslo_concurrency.lockutils [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Acquiring lock "83b04c8c-39f6-4f58-b965-0a94c063b68b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1301.274598] env[67977]: DEBUG oslo_concurrency.lockutils [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Lock "83b04c8c-39f6-4f58-b965-0a94c063b68b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1301.274756] env[67977]: DEBUG oslo_concurrency.lockutils [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Lock "83b04c8c-39f6-4f58-b965-0a94c063b68b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1301.278287] env[67977]: INFO nova.compute.manager [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Terminating instance [ 1301.278396] env[67977]: DEBUG nova.compute.manager [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1301.278578] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1301.279380] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8c580440-ddc6-4079-b899-dd97a444c66c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.289521] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61de4157-5aed-4e54-9e30-1e3692d1713a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.306865] env[67977]: DEBUG nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1301.318878] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 83b04c8c-39f6-4f58-b965-0a94c063b68b could not be found. [ 1301.319122] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1301.323190] env[67977]: INFO nova.compute.manager [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Took 0.04 seconds to destroy the instance on the hypervisor. 
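The sequence above, a WARNING that the VM is gone followed immediately by "Instance destroyed", shows the destroy path being idempotent: a missing backend VM is treated as already destroyed so that network deallocation and claim cleanup still run. A rough sketch of that pattern, with hypothetical backend helper names:

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

def destroy_instance(backend, instance_uuid, log):
    """Destroy on the hypervisor, treating 'already gone' as success."""
    try:
        vm_ref = backend.lookup_vm(instance_uuid)     # hypothetical lookup
        backend.unregister_vm(vm_ref)
        backend.delete_datastore_files(instance_uuid)
    except InstanceNotFound:
        # Seen above as "Instance does not exist on backend": the
        # instance is still reported destroyed so that network and
        # resource-claim teardown proceed normally.
        log.warning("Instance %s not found on backend; continuing teardown",
                    instance_uuid)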
[ 1301.323476] env[67977]: DEBUG oslo.service.loopingcall [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1301.324119] env[67977]: DEBUG nova.compute.manager [-] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1301.324222] env[67977]: DEBUG nova.network.neutron [-] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1301.354945] env[67977]: DEBUG nova.network.neutron [-] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1301.356422] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1301.356660] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1301.358171] env[67977]: INFO nova.compute.claims [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1301.362077] env[67977]: INFO nova.compute.manager [-] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] Took 0.04 seconds to deallocate network for instance. [ 1301.447328] env[67977]: DEBUG oslo_concurrency.lockutils [None req-19abb8f9-60bf-403e-81a2-7def526952d8 tempest-ServersTestManualDisk-987739487 tempest-ServersTestManualDisk-987739487-project-member] Lock "83b04c8c-39f6-4f58-b965-0a94c063b68b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.173s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1301.448281] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "83b04c8c-39f6-4f58-b965-0a94c063b68b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 2.723s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1301.448485] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 83b04c8c-39f6-4f58-b965-0a94c063b68b] During sync_power_state the instance has a pending task (deleting). Skip. 
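The inventory dict reported for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 throughout this log maps to schedulable capacity through the standard placement formula, capacity = (total - reserved) * allocation_ratio, with consumption in step_size units capped at max_unit per allocation. Checking the logged values:

inventory = {
    "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(rc, capacity)
# VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0

So CPU is oversubscribed 4x while memory and disk are not, and a single allocation is still bounded by max_unit (16 VCPUs, 65530 MB, 94 GB) regardless of the total.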
[ 1301.448758] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "83b04c8c-39f6-4f58-b965-0a94c063b68b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1301.672887] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c6b458f-f5da-4707-87dc-a47c773da2e7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.680990] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7239801e-8633-43f5-bf51-8654ccf161df {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.710303] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3180666d-0373-44bb-870a-7b852b85e3c8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.717208] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffc4d344-92e9-4fa6-9a1e-396e503f9cef {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.730175] env[67977]: DEBUG nova.compute.provider_tree [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1301.740275] env[67977]: DEBUG nova.scheduler.client.report [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1301.753646] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.397s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1301.754012] env[67977]: DEBUG nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Start building networks asynchronously for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1301.786541] env[67977]: DEBUG nova.compute.utils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1301.787785] env[67977]: DEBUG nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1301.787958] env[67977]: DEBUG nova.network.neutron [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1301.798026] env[67977]: DEBUG nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1301.858370] env[67977]: DEBUG nova.policy [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4b11845d19f949ec9f5011fb32430517', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c693200ece7542d1b51db597c96768eb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1301.862472] env[67977]: DEBUG nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1301.889493] env[67977]: DEBUG nova.virt.hardware [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=<?>,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:53:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1301.889775] env[67977]: DEBUG nova.virt.hardware [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1301.889941] env[67977]: DEBUG nova.virt.hardware [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1301.890140] env[67977]: DEBUG nova.virt.hardware [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1301.890292] env[67977]: DEBUG nova.virt.hardware [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1301.890431] env[67977]: DEBUG nova.virt.hardware [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1301.894027] env[67977]: DEBUG nova.virt.hardware [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1301.894027] env[67977]: DEBUG nova.virt.hardware [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1301.894027] env[67977]: DEBUG nova.virt.hardware [None
req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1301.894027] env[67977]: DEBUG nova.virt.hardware [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1301.894027] env[67977]: DEBUG nova.virt.hardware [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1301.894442] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa32e868-fe14-417e-bb8b-e7ff36002c16 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.901198] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd2e7f14-d113-4dfe-854f-45301f6ec500 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1302.205793] env[67977]: DEBUG nova.network.neutron [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Successfully created port: e37ac130-546e-472c-83b1-1d77955a8386 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1302.863259] env[67977]: DEBUG nova.network.neutron [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Successfully updated port: e37ac130-546e-472c-83b1-1d77955a8386 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1302.876201] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "refresh_cache-f03fe248-75df-4237-a6dd-cc49012c2331" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1302.876414] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquired lock "refresh_cache-f03fe248-75df-4237-a6dd-cc49012c2331" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1302.876641] env[67977]: DEBUG nova.network.neutron [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1302.925894] env[67977]: DEBUG nova.network.neutron [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 
tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1303.350297] env[67977]: DEBUG nova.network.neutron [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Updating instance_info_cache with network_info: [{"id": "e37ac130-546e-472c-83b1-1d77955a8386", "address": "fa:16:3e:9f:41:9d", "network": {"id": "cbd12706-d748-47e2-84b6-d528ee8d4a61", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-955781397-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c693200ece7542d1b51db597c96768eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "680cb499-2a47-482b-af0d-112016ac0e17", "external-id": "nsx-vlan-transportzone-644", "segmentation_id": 644, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape37ac130-54", "ovs_interfaceid": "e37ac130-546e-472c-83b1-1d77955a8386", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1303.362509] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Releasing lock "refresh_cache-f03fe248-75df-4237-a6dd-cc49012c2331" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1303.362807] env[67977]: DEBUG nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Instance network_info: |[{"id": "e37ac130-546e-472c-83b1-1d77955a8386", "address": "fa:16:3e:9f:41:9d", "network": {"id": "cbd12706-d748-47e2-84b6-d528ee8d4a61", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-955781397-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c693200ece7542d1b51db597c96768eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "680cb499-2a47-482b-af0d-112016ac0e17", "external-id": "nsx-vlan-transportzone-644", "segmentation_id": 644, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape37ac130-54", "ovs_interfaceid": "e37ac130-546e-472c-83b1-1d77955a8386", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1303.363375] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9f:41:9d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '680cb499-2a47-482b-af0d-112016ac0e17', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e37ac130-546e-472c-83b1-1d77955a8386', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1303.370799] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Creating folder: Project (c693200ece7542d1b51db597c96768eb). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1303.371365] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7c337344-8f5f-4fb2-8b82-98554e582692 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.381842] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Created folder: Project (c693200ece7542d1b51db597c96768eb) in parent group-v693022. [ 1303.382041] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Creating folder: Instances. Parent ref: group-v693092. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1303.382269] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-27171f18-b954-44cd-8843-3c3158ebbebb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.391118] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Created folder: Instances in parent group-v693092. [ 1303.391183] env[67977]: DEBUG oslo.service.loopingcall [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1303.391351] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1303.391535] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-442e6141-4ac2-4476-894f-0b5cdf8763c0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.410712] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1303.410712] env[67977]: value = "task-3468212" [ 1303.410712] env[67977]: _type = "Task" [ 1303.410712] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1303.416837] env[67977]: DEBUG nova.compute.manager [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Received event network-vif-plugged-e37ac130-546e-472c-83b1-1d77955a8386 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1303.417061] env[67977]: DEBUG oslo_concurrency.lockutils [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] Acquiring lock "f03fe248-75df-4237-a6dd-cc49012c2331-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1303.417319] env[67977]: DEBUG oslo_concurrency.lockutils [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] Lock "f03fe248-75df-4237-a6dd-cc49012c2331-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1303.417431] env[67977]: DEBUG oslo_concurrency.lockutils [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] Lock "f03fe248-75df-4237-a6dd-cc49012c2331-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1303.417624] env[67977]: DEBUG nova.compute.manager [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] No waiting events found dispatching network-vif-plugged-e37ac130-546e-472c-83b1-1d77955a8386 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1303.417773] env[67977]: WARNING nova.compute.manager [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Received unexpected event network-vif-plugged-e37ac130-546e-472c-83b1-1d77955a8386 for instance with vm_state building and task_state spawning.
[ 1303.417932] env[67977]: DEBUG nova.compute.manager [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Received event network-changed-e37ac130-546e-472c-83b1-1d77955a8386 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1303.418133] env[67977]: DEBUG nova.compute.manager [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Refreshing instance network info cache due to event network-changed-e37ac130-546e-472c-83b1-1d77955a8386. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1303.418329] env[67977]: DEBUG oslo_concurrency.lockutils [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] Acquiring lock "refresh_cache-f03fe248-75df-4237-a6dd-cc49012c2331" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1303.418466] env[67977]: DEBUG oslo_concurrency.lockutils [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] Acquired lock "refresh_cache-f03fe248-75df-4237-a6dd-cc49012c2331" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1303.418622] env[67977]: DEBUG nova.network.neutron [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Refreshing network info cache for port e37ac130-546e-472c-83b1-1d77955a8386 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1303.423485] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468212, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1303.773295] env[67977]: DEBUG nova.network.neutron [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Updated VIF entry in instance network info cache for port e37ac130-546e-472c-83b1-1d77955a8386. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1303.773669] env[67977]: DEBUG nova.network.neutron [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Updating instance_info_cache with network_info: [{"id": "e37ac130-546e-472c-83b1-1d77955a8386", "address": "fa:16:3e:9f:41:9d", "network": {"id": "cbd12706-d748-47e2-84b6-d528ee8d4a61", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-955781397-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c693200ece7542d1b51db597c96768eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "680cb499-2a47-482b-af0d-112016ac0e17", "external-id": "nsx-vlan-transportzone-644", "segmentation_id": 644, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape37ac130-54", "ovs_interfaceid": "e37ac130-546e-472c-83b1-1d77955a8386", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1303.783012] env[67977]: DEBUG oslo_concurrency.lockutils [req-9a382a00-7890-42c5-af2c-263ca9347234 req-a46d12df-f78d-4ab4-8ea2-f12ca6e5f831 service nova] Releasing lock "refresh_cache-f03fe248-75df-4237-a6dd-cc49012c2331" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1303.921028] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468212, 'name': CreateVM_Task, 'duration_secs': 0.441716} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1303.921337] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1303.921889] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1303.922096] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1303.922429] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1303.922677] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9062100c-8dbf-4700-a7d9-718873fe5506 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.927053] env[67977]: DEBUG oslo_vmware.api [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for the task: (returnval){ [ 1303.927053] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]522e61ee-9b40-b6f1-7cba-495b1916c955" [ 1303.927053] env[67977]: _type = "Task" [ 1303.927053] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1303.934492] env[67977]: DEBUG oslo_vmware.api [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]522e61ee-9b40-b6f1-7cba-495b1916c955, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1304.437203] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1304.437482] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1304.437693] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1311.768610] env[67977]: DEBUG oslo_concurrency.lockutils [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "f03fe248-75df-4237-a6dd-cc49012c2331" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1331.775660] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1331.775660] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1332.770606] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1332.775301] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1333.775575] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1333.775877] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1334.775286] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1334.775528] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1335.777372] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1335.808990] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1335.809254] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1335.809443] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1335.809619] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1335.810767] 
env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31d7ee1c-2a44-47f2-853e-3b4b911a120b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.821782] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91dd93f3-436b-4b53-9e0e-28aaab5003e1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.837012] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-495bde45-b3bd-411f-9a84-2083d1590dc4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.843923] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99f9576e-ad8d-4de9-892b-8909f1274eae {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.874799] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180934MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1335.875062] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1335.875407] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1335.959687] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1335.959858] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1335.959987] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d7719b11-cef7-4878-a693-24dcd085a1d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1335.960124] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1335.960245] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1335.960363] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1335.960506] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 32d860b3-f438-400f-8296-e62cc662d618 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1335.960627] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1335.960744] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8f1440c5-e712-4635-9f02-f9cda12da693 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1335.960863] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f03fe248-75df-4237-a6dd-cc49012c2331 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1335.976766] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 206998ab-992a-4bc0-b176-eb490b3cb479 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1335.988405] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f66d73a3-535f-45ba-a9c4-2436da6ea511 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.001447] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 120b8d8f-3f0e-4cb4-a112-695d6365788a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.014272] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.028499] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance a61121e3-0375-41ef-9dad-0469583640ab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.040505] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8351701c-562c-4ec6-998a-a0b1a62f0c5e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.052939] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ea5d10a1-2beb-4997-beff-e6a0e24e5b9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.066132] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 33412c29-7b03-421f-9502-4c2a639adfd7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.078108] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 508bc9c7-8a3b-4ac8-9eda-b2d3f5916997 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.089047] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8d6894ae-ecbf-4ad2-ae91-463c88ef5de3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.100497] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e77a441b-952b-42c0-907f-e30888e505a8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.117172] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1e05e7be-d468-4908-a1e4-8a11064277b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.128305] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ee368449-13fb-431d-9ae5-f8c08d777336 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.139530] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1336.139530] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1336.139530] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1336.471942] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66981b7f-1f63-4c2f-a2d3-759c9181b3d5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1336.480027] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1ebbf5c-80fe-4d1e-898a-bcaffe271a1b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1336.512870] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bb6749b-6b86-47a6-a2fe-b64a3252a52f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1336.519820] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4db040ff-3b89-417a-9cdf-2d1ea1418ae8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1336.534262] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1336.542651] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1336.558886] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1336.559099] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.684s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1339.558578] env[67977]: DEBUG oslo_service.periodic_task [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1339.558874] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1339.558916] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1339.582471] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1339.582635] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1339.582775] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1339.582904] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1339.583030] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1339.583152] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1339.583349] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1339.583506] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1339.583603] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1339.583723] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1339.583842] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1348.049434] env[67977]: WARNING oslo_vmware.rw_handles [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1348.049434] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1348.049434] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1348.049434] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1348.049434] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1348.049434] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1348.049434] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1348.049434] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1348.049434] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1348.049434] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1348.049434] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1348.049434] env[67977]: ERROR oslo_vmware.rw_handles [ 1348.050518] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/b283aff3-a7f4-4641-bc3c-98ee0122d698/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1348.051992] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1348.052445] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Copying Virtual Disk [datastore1] vmware_temp/b283aff3-a7f4-4641-bc3c-98ee0122d698/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] 
vmware_temp/b283aff3-a7f4-4641-bc3c-98ee0122d698/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1348.052808] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5e2c6357-2f0a-41bf-b9b4-0269c0fd8a21 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.061169] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for the task: (returnval){ [ 1348.061169] env[67977]: value = "task-3468213" [ 1348.061169] env[67977]: _type = "Task" [ 1348.061169] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1348.069377] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Task: {'id': task-3468213, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1348.574394] env[67977]: DEBUG oslo_vmware.exceptions [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1348.576573] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1348.576573] env[67977]: ERROR nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1348.576573] env[67977]: Faults: ['InvalidArgument'] [ 1348.576573] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Traceback (most recent call last): [ 1348.576573] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1348.576573] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] yield resources [ 1348.576573] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1348.576573] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] self.driver.spawn(context, instance, image_meta, [ 1348.576573] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 
1348.576573] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] self._fetch_image_if_missing(context, vi) [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] image_cache(vi, tmp_image_ds_loc) [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] vm_util.copy_virtual_disk( [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] session._wait_for_task(vmdk_copy_task) [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] return self.wait_for_task(task_ref) [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] return evt.wait() [ 1348.577026] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1348.577457] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] result = hub.switch() [ 1348.577457] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1348.577457] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] return self.greenlet.switch() [ 1348.577457] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1348.577457] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] self.f(*self.args, **self.kw) [ 1348.577457] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1348.577457] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] raise exceptions.translate_fault(task_info.error) [ 1348.577457] env[67977]: ERROR nova.compute.manager [instance: 
e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1348.577457] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Faults: ['InvalidArgument'] [ 1348.577457] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] [ 1348.577457] env[67977]: INFO nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Terminating instance [ 1348.578057] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1348.578317] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1348.578976] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1348.579189] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1348.579411] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0dd7f629-f89b-4ddb-9088-d38d4bda5c62 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.582382] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f454f6d-cb03-4b85-b674-803f9bc4b825 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.589814] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1348.590076] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8b09af6f-5677-4332-9645-8614adad05fb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.592411] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 
tempest-ListServersNegativeTestJSON-1979410733-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1348.592677] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1348.597708] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d1e02117-df5a-4fd6-b610-2c70014911a9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.606305] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for the task: (returnval){ [ 1348.606305] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52bbef83-7c91-0c5e-f351-b0823467c6f4" [ 1348.606305] env[67977]: _type = "Task" [ 1348.606305] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1348.614189] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52bbef83-7c91-0c5e-f351-b0823467c6f4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1348.668022] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1348.668022] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1348.668022] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Deleting the datastore file [datastore1] e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1348.668022] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-da08aca0-28f1-44ae-bbd4-2063b2b9252e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.673205] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for the task: (returnval){ [ 1348.673205] env[67977]: value = "task-3468215" [ 
1348.673205] env[67977]: _type = "Task" [ 1348.673205] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1348.682962] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Task: {'id': task-3468215, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1349.116841] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1349.117163] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Creating directory with path [datastore1] vmware_temp/46e061c2-116e-4068-a7fe-a96e613690fb/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1349.117409] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a52de6f7-cd0d-4b31-b54f-5212d02074bd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.128540] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Created directory with path [datastore1] vmware_temp/46e061c2-116e-4068-a7fe-a96e613690fb/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1349.128734] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Fetch image to [datastore1] vmware_temp/46e061c2-116e-4068-a7fe-a96e613690fb/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1349.128906] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/46e061c2-116e-4068-a7fe-a96e613690fb/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1349.129706] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8056898e-50f4-4892-bd54-d43f2e0c8598 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.136277] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4c74127-a37c-4ae2-bbf1-4c775c73a481 {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.145231] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25d6a787-ed3d-47ff-9c97-45cb3bea981d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.181046] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-474bdfef-26c3-4d8c-a353-5a473f566f33 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.187756] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Task: {'id': task-3468215, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078825} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1349.189311] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1349.189521] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1349.189721] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1349.189901] env[67977]: INFO nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Took 0.61 seconds to destroy the instance on the hypervisor. 
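The "A specified parameter was not correct: fileType" fault above surfaces when CopyVirtualDisk_Task finishes in the error state and the polling loop translates the task fault into a VimFaultException (the raise exceptions.translate_fault(task_info.error) frame in the traceback). A minimal, self-contained sketch of that poll-and-translate pattern, with illustrative names rather than the real oslo.vmware internals:

    # Illustrative sketch only, not the actual oslo.vmware implementation.
    # vSphere TaskInfo.state takes the values queued, running, success, error.
    import time

    class TaskInfo:
        def __init__(self, state, error=None):
            self.state = state
            self.error = error

    def wait_for_task(read_task_info, interval=0.5):
        """Poll a task until it settles; raise if it ends in the error state."""
        while True:
            info = read_task_info()
            if info.state in ('queued', 'running'):
                time.sleep(interval)
                continue
            if info.state == 'success':
                return info
            # Mirrors what oslo_vmware.api._poll_task does with
            # exceptions.translate_fault(task_info.error); in the traceback
            # above that fault is InvalidArgument on the copy's fileType.
            raise RuntimeError('task failed: %s' % info.error)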
[ 1349.191992] env[67977]: DEBUG nova.compute.claims [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1349.192191] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1349.192410] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1349.194871] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-93f13824-2e28-4f2c-aa6c-6a36b9615052 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.218600] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1349.446904] env[67977]: DEBUG oslo_vmware.rw_handles [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/46e061c2-116e-4068-a7fe-a96e613690fb/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1349.507809] env[67977]: DEBUG oslo_vmware.rw_handles [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1349.507987] env[67977]: DEBUG oslo_vmware.rw_handles [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/46e061c2-116e-4068-a7fe-a96e613690fb/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1349.551666] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95e670b3-9f0c-48f5-9515-5a08238c4e51 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.559925] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-351d53ba-903f-4d14-977c-443576432af6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.590533] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48d73a0f-220b-4461-8b8e-e05bcd33eca7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.598170] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61aa3e51-f34a-4cc2-8d12-afb9d60fa009 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.614296] env[67977]: DEBUG nova.compute.provider_tree [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1349.623887] env[67977]: DEBUG nova.scheduler.client.report [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1349.638853] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.446s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1349.639416] env[67977]: ERROR nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1349.639416] env[67977]: Faults: ['InvalidArgument'] [ 1349.639416] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Traceback (most recent call last): [ 1349.639416] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1349.639416] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] self.driver.spawn(context, instance, image_meta, [ 1349.639416] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1349.639416] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1349.639416] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1349.639416] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] self._fetch_image_if_missing(context, vi) [ 1349.639416] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1349.639416] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] image_cache(vi, tmp_image_ds_loc) [ 1349.639416] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] vm_util.copy_virtual_disk( [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] session._wait_for_task(vmdk_copy_task) [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] return self.wait_for_task(task_ref) [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] return evt.wait() [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] result = hub.switch() [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] return self.greenlet.switch() [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1349.640019] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] self.f(*self.args, **self.kw) [ 1349.640408] env[67977]: ERROR nova.compute.manager [instance: 
e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1349.640408] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] raise exceptions.translate_fault(task_info.error) [ 1349.640408] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1349.640408] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Faults: ['InvalidArgument'] [ 1349.640408] env[67977]: ERROR nova.compute.manager [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] [ 1349.640408] env[67977]: DEBUG nova.compute.utils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1349.641726] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Build of instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 was re-scheduled: A specified parameter was not correct: fileType [ 1349.641726] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1349.642121] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1349.642296] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1349.642463] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1349.642622] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1350.055192] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1350.069138] env[67977]: INFO nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Took 0.43 seconds to deallocate network for instance. [ 1350.168968] env[67977]: INFO nova.scheduler.client.report [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Deleted allocations for instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 [ 1350.191830] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 607.425s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.193609] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 407.278s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1350.193609] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1350.193917] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991-events" 
acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1350.194071] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.196084] env[67977]: INFO nova.compute.manager [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Terminating instance [ 1350.197970] env[67977]: DEBUG nova.compute.manager [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1350.198184] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1350.198664] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c6d6de14-3412-4c98-ad70-ed01850e70e4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.210058] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d1454aa-e59b-43fb-b31a-de3121d75d12 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.221653] env[67977]: DEBUG nova.compute.manager [None req-fa2ab9d9-0aa9-4f0c-a48e-bb0c1ec47dba tempest-ServerShowV254Test-1567276119 tempest-ServerShowV254Test-1567276119-project-member] [instance: 2eb73f59-f6c3-4816-b545-9ab391697785] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1350.246013] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e3983ca6-b3d0-460c-aebc-1a4d2c8b9991 could not be found. 
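The WARNING above shows the terminate path tolerating a VM that was already cleaned up: InstanceNotFound from the backend is downgraded to a warning and destroy continues, which keeps delete idempotent when it races with an earlier teardown. A small sketch of that tolerate-missing-backend pattern, using hypothetical helper names rather than nova's actual code:

    # Hypothetical sketch: a missing backend VM is treated as already
    # destroyed, so a delete that races with cleanup can be retried safely.
    class InstanceNotFound(Exception):
        pass

    def destroy_instance(find_vm, unregister_vm, instance_id):
        try:
            unregister_vm(find_vm(instance_id))
        except InstanceNotFound:
            # Same outcome as the log line above: "Instance does not exist
            # on backend" is logged as a warning and destroy still succeeds.
            print('Instance %s not on backend; treating as destroyed'
                  % instance_id)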
[ 1350.246013] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1350.246013] env[67977]: INFO nova.compute.manager [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1350.246013] env[67977]: DEBUG oslo.service.loopingcall [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1350.246013] env[67977]: DEBUG nova.compute.manager [-] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1350.246251] env[67977]: DEBUG nova.network.neutron [-] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1350.252311] env[67977]: DEBUG nova.compute.manager [None req-fa2ab9d9-0aa9-4f0c-a48e-bb0c1ec47dba tempest-ServerShowV254Test-1567276119 tempest-ServerShowV254Test-1567276119-project-member] [instance: 2eb73f59-f6c3-4816-b545-9ab391697785] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1350.276794] env[67977]: DEBUG nova.network.neutron [-] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1350.278581] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fa2ab9d9-0aa9-4f0c-a48e-bb0c1ec47dba tempest-ServerShowV254Test-1567276119 tempest-ServerShowV254Test-1567276119-project-member] Lock "2eb73f59-f6c3-4816-b545-9ab391697785" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.949s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.284913] env[67977]: INFO nova.compute.manager [-] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] Took 0.04 seconds to deallocate network for instance. [ 1350.290061] env[67977]: DEBUG nova.compute.manager [None req-ac88ae81-04db-4474-8c3e-4da498e6468e tempest-ServerGroupTestJSON-1257420963 tempest-ServerGroupTestJSON-1257420963-project-member] [instance: 206998ab-992a-4bc0-b176-eb490b3cb479] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1350.313087] env[67977]: DEBUG nova.compute.manager [None req-ac88ae81-04db-4474-8c3e-4da498e6468e tempest-ServerGroupTestJSON-1257420963 tempest-ServerGroupTestJSON-1257420963-project-member] [instance: 206998ab-992a-4bc0-b176-eb490b3cb479] Instance disappeared before build. 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1350.337699] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ac88ae81-04db-4474-8c3e-4da498e6468e tempest-ServerGroupTestJSON-1257420963 tempest-ServerGroupTestJSON-1257420963-project-member] Lock "206998ab-992a-4bc0-b176-eb490b3cb479" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.288s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.346368] env[67977]: DEBUG nova.compute.manager [None req-6ddf2ff5-da05-4281-aa65-ed713584cc3e tempest-ServerMetadataTestJSON-580247377 tempest-ServerMetadataTestJSON-580247377-project-member] [instance: f66d73a3-535f-45ba-a9c4-2436da6ea511] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1350.374912] env[67977]: DEBUG nova.compute.manager [None req-6ddf2ff5-da05-4281-aa65-ed713584cc3e tempest-ServerMetadataTestJSON-580247377 tempest-ServerMetadataTestJSON-580247377-project-member] [instance: f66d73a3-535f-45ba-a9c4-2436da6ea511] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1350.393779] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b2a0815c-bf12-4613-a13c-1118ffb04358 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.200s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.394708] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 51.670s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1350.394899] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e3983ca6-b3d0-460c-aebc-1a4d2c8b9991] During sync_power_state the instance has a pending task (deleting). Skip. [ 1350.395085] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "e3983ca6-b3d0-460c-aebc-1a4d2c8b9991" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.399618] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6ddf2ff5-da05-4281-aa65-ed713584cc3e tempest-ServerMetadataTestJSON-580247377 tempest-ServerMetadataTestJSON-580247377-project-member] Lock "f66d73a3-535f-45ba-a9c4-2436da6ea511" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.348s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.407919] env[67977]: DEBUG nova.compute.manager [None req-a19c358d-95ea-4bb3-8813-d105ebab962c tempest-ServerRescueTestJSON-961321 tempest-ServerRescueTestJSON-961321-project-member] [instance: 120b8d8f-3f0e-4cb4-a112-695d6365788a] Starting instance... 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1350.429539] env[67977]: DEBUG nova.compute.manager [None req-a19c358d-95ea-4bb3-8813-d105ebab962c tempest-ServerRescueTestJSON-961321 tempest-ServerRescueTestJSON-961321-project-member] [instance: 120b8d8f-3f0e-4cb4-a112-695d6365788a] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1350.449476] env[67977]: DEBUG oslo_concurrency.lockutils [None req-a19c358d-95ea-4bb3-8813-d105ebab962c tempest-ServerRescueTestJSON-961321 tempest-ServerRescueTestJSON-961321-project-member] Lock "120b8d8f-3f0e-4cb4-a112-695d6365788a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.186s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.458279] env[67977]: DEBUG nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1350.504742] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1350.504993] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1350.506385] env[67977]: INFO nova.compute.claims [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1350.777276] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8438e71e-16cc-4a5e-b1ea-893aa55b7f10 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.784714] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a407993-1050-4206-8a00-a81e0330c4dd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.814025] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1f6e6aa-c0ce-4d2f-8e64-7594057e7816 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.821438] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dac93be-1cd4-4f70-8814-a21354388835 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.835094] env[67977]: DEBUG 
nova.compute.provider_tree [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1350.845020] env[67977]: DEBUG nova.scheduler.client.report [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1350.861376] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.356s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.861842] env[67977]: DEBUG nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1350.896842] env[67977]: DEBUG nova.compute.utils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1350.899606] env[67977]: DEBUG nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1350.899606] env[67977]: DEBUG nova.network.neutron [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1350.935776] env[67977]: DEBUG nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Start building block device mappings for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1350.967761] env[67977]: DEBUG nova.policy [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '78df84566c65469890b3b6f15f3e5e01', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ff581ae563e45108f497cade6990d79', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1351.004520] env[67977]: DEBUG nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1351.033999] env[67977]: DEBUG nova.virt.hardware [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1351.034259] env[67977]: DEBUG nova.virt.hardware [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1351.034414] env[67977]: DEBUG nova.virt.hardware [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1351.034593] env[67977]: DEBUG nova.virt.hardware [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1351.034758] env[67977]: DEBUG nova.virt.hardware [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1351.035090] env[67977]: DEBUG nova.virt.hardware [None 
req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1351.035162] env[67977]: DEBUG nova.virt.hardware [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1351.035279] env[67977]: DEBUG nova.virt.hardware [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1351.035756] env[67977]: DEBUG nova.virt.hardware [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1351.035756] env[67977]: DEBUG nova.virt.hardware [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1351.035963] env[67977]: DEBUG nova.virt.hardware [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1351.036874] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b797bf04-6c4b-46f7-9e9e-cc08111267fe {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.046347] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df4be39f-b183-453c-9cf7-64b8cd54a79d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.350581] env[67977]: DEBUG nova.network.neutron [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Successfully created port: 5c8c0765-4515-4b14-92f3-94a063174874 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1352.099894] env[67977]: DEBUG nova.network.neutron [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Successfully updated port: 5c8c0765-4515-4b14-92f3-94a063174874 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1352.111483] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 
tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "refresh_cache-5edda5cc-6295-4abe-a21e-0cf684063cb3" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1352.111614] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "refresh_cache-5edda5cc-6295-4abe-a21e-0cf684063cb3" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1352.111714] env[67977]: DEBUG nova.network.neutron [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1352.149088] env[67977]: DEBUG nova.compute.manager [req-9a94d04b-3b27-4f42-8cfc-5955c5de54de req-38843768-af84-4082-8509-a92b47118d57 service nova] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Received event network-vif-plugged-5c8c0765-4515-4b14-92f3-94a063174874 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1352.149313] env[67977]: DEBUG oslo_concurrency.lockutils [req-9a94d04b-3b27-4f42-8cfc-5955c5de54de req-38843768-af84-4082-8509-a92b47118d57 service nova] Acquiring lock "5edda5cc-6295-4abe-a21e-0cf684063cb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1352.149515] env[67977]: DEBUG oslo_concurrency.lockutils [req-9a94d04b-3b27-4f42-8cfc-5955c5de54de req-38843768-af84-4082-8509-a92b47118d57 service nova] Lock "5edda5cc-6295-4abe-a21e-0cf684063cb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1352.149704] env[67977]: DEBUG oslo_concurrency.lockutils [req-9a94d04b-3b27-4f42-8cfc-5955c5de54de req-38843768-af84-4082-8509-a92b47118d57 service nova] Lock "5edda5cc-6295-4abe-a21e-0cf684063cb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1352.149877] env[67977]: DEBUG nova.compute.manager [req-9a94d04b-3b27-4f42-8cfc-5955c5de54de req-38843768-af84-4082-8509-a92b47118d57 service nova] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] No waiting events found dispatching network-vif-plugged-5c8c0765-4515-4b14-92f3-94a063174874 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1352.150051] env[67977]: WARNING nova.compute.manager [req-9a94d04b-3b27-4f42-8cfc-5955c5de54de req-38843768-af84-4082-8509-a92b47118d57 service nova] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Received unexpected event network-vif-plugged-5c8c0765-4515-4b14-92f3-94a063174874 for instance with vm_state building and task_state spawning. [ 1352.162966] env[67977]: DEBUG nova.network.neutron [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Instance cache missing network info. 
[ 1352.162966] env[67977]: DEBUG nova.network.neutron [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1352.346676] env[67977]: DEBUG nova.network.neutron [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Updating instance_info_cache with network_info: [{"id": "5c8c0765-4515-4b14-92f3-94a063174874", "address": "fa:16:3e:bf:46:d2", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c8c0765-45", "ovs_interfaceid": "5c8c0765-4515-4b14-92f3-94a063174874", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1352.357127] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "refresh_cache-5edda5cc-6295-4abe-a21e-0cf684063cb3" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1352.357421] env[67977]: DEBUG nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Instance network_info: |[{"id": "5c8c0765-4515-4b14-92f3-94a063174874", "address": "fa:16:3e:bf:46:d2", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c8c0765-45", "ovs_interfaceid": "5c8c0765-4515-4b14-92f3-94a063174874", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1352.357840] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bf:46:d2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5efce30e-48dd-493a-a354-f562a8adf7af', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5c8c0765-4515-4b14-92f3-94a063174874', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1352.366077] env[67977]: DEBUG oslo.service.loopingcall [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1352.366077] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1352.366077] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-55e1365c-b03c-4da5-a614-6ca84df18157 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1352.387213] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1352.387213] env[67977]: value = "task-3468216"
[ 1352.387213] env[67977]: _type = "Task"
[ 1352.387213] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1352.397900] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468216, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1352.898082] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468216, 'name': CreateVM_Task, 'duration_secs': 0.279975} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
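
The CreateVM_Task entries above show the client-side task protocol: submit the task, then poll its state ("progress is 0%") until it either completes or faults. An illustrative polling loop in the same spirit (hypothetical helper, not oslo.vmware's actual signature):

    import time

    class TaskFailed(Exception):
        pass

    def wait_for_task(get_task_info, interval=0.5):
        # Poll until the task reaches a terminal state, as the log does
        # between "progress is 0%" and "completed successfully".
        while True:
            info = get_task_info()
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                raise TaskFailed(info.get('error'))
            time.sleep(interval)

    # Canned responses standing in for vCenter task polling:
    states = iter([{'state': 'running', 'progress': 0},
                   {'state': 'success', 'result': 'vm-123'}])
    print(wait_for_task(lambda: next(states), interval=0))
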
[ 1352.898261] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1352.905190] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1352.905317] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1352.905637] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1352.905881] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7f6ab996-a211-4ef7-8616-b6b0d27cb577 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1352.910526] env[67977]: DEBUG oslo_vmware.api [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){
[ 1352.910526] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]525fa235-616d-8cdf-1853-be3674833fe9"
[ 1352.910526] env[67977]: _type = "Task"
[ 1352.910526] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1352.919247] env[67977]: DEBUG oslo_vmware.api [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]525fa235-616d-8cdf-1853-be3674833fe9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1353.420931] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1353.422135] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1353.422135] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
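
The lock names above encode the datastore image-cache layout visible throughout this log: a cached image lives at devstack-image-cache_base/<image-id>/<image-id>.vmdk, and serializing on that path lets concurrent builds share one cached copy instead of downloading the image again. A small sketch of that path convention (illustrative only, not the driver's code):

    import uuid

    def cache_vmdk_path(datastore, cache_folder, image_id):
        # The per-image cache location the builds lock on.
        return '[%s] %s/%s/%s.vmdk' % (datastore, cache_folder, image_id, image_id)

    def temp_fetch_path(datastore, image_id, upload_name='tmp-sparse.vmdk'):
        # Fresh downloads land in a per-request vmware_temp directory first.
        return '[%s] vmware_temp/%s/%s/%s' % (datastore, uuid.uuid4(),
                                              image_id, upload_name)

    image = '5ac2bac3-6c5c-4005-b6b0-349a1330d017'
    print(cache_vmdk_path('datastore1', 'devstack-image-cache_base', image))
    print(temp_fetch_path('datastore1', image))
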
[ 1354.194886] env[67977]: DEBUG nova.compute.manager [req-2407520e-59d6-405c-9eb3-2566a4860d99 req-581d9057-bbaa-47a5-97f1-2ae2e5529d39 service nova] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Received event network-changed-5c8c0765-4515-4b14-92f3-94a063174874 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1354.195110] env[67977]: DEBUG nova.compute.manager [req-2407520e-59d6-405c-9eb3-2566a4860d99 req-581d9057-bbaa-47a5-97f1-2ae2e5529d39 service nova] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Refreshing instance network info cache due to event network-changed-5c8c0765-4515-4b14-92f3-94a063174874. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1354.195335] env[67977]: DEBUG oslo_concurrency.lockutils [req-2407520e-59d6-405c-9eb3-2566a4860d99 req-581d9057-bbaa-47a5-97f1-2ae2e5529d39 service nova] Acquiring lock "refresh_cache-5edda5cc-6295-4abe-a21e-0cf684063cb3" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1354.195522] env[67977]: DEBUG oslo_concurrency.lockutils [req-2407520e-59d6-405c-9eb3-2566a4860d99 req-581d9057-bbaa-47a5-97f1-2ae2e5529d39 service nova] Acquired lock "refresh_cache-5edda5cc-6295-4abe-a21e-0cf684063cb3" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1354.195701] env[67977]: DEBUG nova.network.neutron [req-2407520e-59d6-405c-9eb3-2566a4860d99 req-581d9057-bbaa-47a5-97f1-2ae2e5529d39 service nova] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Refreshing network info cache for port 5c8c0765-4515-4b14-92f3-94a063174874 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1354.458300] env[67977]: DEBUG nova.network.neutron [req-2407520e-59d6-405c-9eb3-2566a4860d99 req-581d9057-bbaa-47a5-97f1-2ae2e5529d39 service nova] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Updated VIF entry in instance network info cache for port 5c8c0765-4515-4b14-92f3-94a063174874. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1354.458726] env[67977]: DEBUG nova.network.neutron [req-2407520e-59d6-405c-9eb3-2566a4860d99 req-581d9057-bbaa-47a5-97f1-2ae2e5529d39 service nova] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Updating instance_info_cache with network_info: [{"id": "5c8c0765-4515-4b14-92f3-94a063174874", "address": "fa:16:3e:bf:46:d2", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c8c0765-45", "ovs_interfaceid": "5c8c0765-4515-4b14-92f3-94a063174874", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1354.468809] env[67977]: DEBUG oslo_concurrency.lockutils [req-2407520e-59d6-405c-9eb3-2566a4860d99 req-581d9057-bbaa-47a5-97f1-2ae2e5529d39 service nova] Releasing lock "refresh_cache-5edda5cc-6295-4abe-a21e-0cf684063cb3" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1360.150913] env[67977]: DEBUG oslo_concurrency.lockutils [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "5edda5cc-6295-4abe-a21e-0cf684063cb3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1365.093897] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "6fae5126-6618-4337-9a52-d6019727e0b0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1365.094200] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "6fae5126-6618-4337-9a52-d6019727e0b0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1380.940633] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "b56ab7a8-cd27-4542-8082-ec023c57e153" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1380.940898] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "b56ab7a8-cd27-4542-8082-ec023c57e153" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1392.774689] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1392.799206] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1392.799365] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1393.775438] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1393.775756] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1394.771323] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1394.775952] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1395.350097] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1395.774588] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1395.774849] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1396.775324] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1396.787122] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1396.787364] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1396.787534] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1396.787689] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1396.788924] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f7fdb06-f187-409f-a606-64d09b5c8c3a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1396.799022] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee21d894-5b83-472c-96a4-48d427b51b01 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1396.811837] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d68f1dc-dd57-4802-8955-5cda10c378ab {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1396.817797] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4509f619-b8cf-459e-9c88-c2a93566bac1 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1396.846502] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180916MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1396.846654] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1396.846839] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1396.920500] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1396.920753] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d7719b11-cef7-4878-a693-24dcd085a1d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1396.920927] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1396.921087] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1396.921262] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1396.921395] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 32d860b3-f438-400f-8296-e62cc662d618 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1396.921585] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1396.921758] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8f1440c5-e712-4635-9f02-f9cda12da693 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1396.922210] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f03fe248-75df-4237-a6dd-cc49012c2331 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1396.922210] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1396.933304] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e77a441b-952b-42c0-907f-e30888e505a8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1396.943624] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1e05e7be-d468-4908-a1e4-8a11064277b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1396.953388] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ee368449-13fb-431d-9ae5-f8c08d777336 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1396.962760] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1396.972246] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6fae5126-6618-4337-9a52-d6019727e0b0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1396.981069] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b56ab7a8-cd27-4542-8082-ec023c57e153 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1396.990574] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1396.990790] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1396.990936] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1397.199462] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6dd088d-f047-4674-87e8-61a279aa9eac {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1397.208268] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33b26c50-52fd-4b19-b892-dfa354b71551 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1397.236982] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e87e9baa-0bb7-4772-b34d-4fbc1c6e8438 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1397.244074] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5b48f58-9888-4ada-ae3f-090a3ae19d78 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1397.257225] env[67977]: DEBUG nova.compute.provider_tree [None 
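
The audit above can be checked by hand: ten actively managed instances each hold DISK_GB=1, MEMORY_MB=128 and VCPU=1, and the node reserves 512 MB of RAM, which reproduces the reported used_ram=1792MB, used_disk=10GB and used_vcpus=10. As a worked example, using only values taken from these entries:

    # Resource-tracker arithmetic for this audit cycle.
    instances = 10                     # the ten "actively managed" instances
    per_instance = {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}
    reserved_ram_mb = 512              # the MEMORY_MB 'reserved' in the inventory

    used_ram_mb = reserved_ram_mb + instances * per_instance['MEMORY_MB']
    used_disk_gb = instances * per_instance['DISK_GB']
    used_vcpus = instances * per_instance['VCPU']

    assert (used_ram_mb, used_disk_gb, used_vcpus) == (1792, 10, 10)
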
[ 1397.199462] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6dd088d-f047-4674-87e8-61a279aa9eac {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1397.208268] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33b26c50-52fd-4b19-b892-dfa354b71551 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1397.236982] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e87e9baa-0bb7-4772-b34d-4fbc1c6e8438 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1397.244074] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5b48f58-9888-4ada-ae3f-090a3ae19d78 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1397.257225] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1397.265284] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1397.281397] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1397.281397] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.434s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1397.466429] env[67977]: WARNING oslo_vmware.rw_handles [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1397.466429] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1397.466429] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1397.466429] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1397.466429] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1397.466429] env[67977]: ERROR oslo_vmware.rw_handles response.begin()
[ 1397.466429] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1397.466429] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1397.466429] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1397.466429] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1397.466429] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1397.466429] env[67977]: ERROR oslo_vmware.rw_handles
[ 1397.466429] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/46e061c2-116e-4068-a7fe-a96e613690fb/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1397.468643] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1397.468939] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Copying Virtual Disk [datastore1] vmware_temp/46e061c2-116e-4068-a7fe-a96e613690fb/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/46e061c2-116e-4068-a7fe-a96e613690fb/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1397.469272] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5de25c15-cbc3-4d1c-aef0-3324b9a7c8c1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1397.477683] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for the task: (returnval){
[ 1397.477683] env[67977]: value = "task-3468217"
[ 1397.477683] env[67977]: _type = "Task"
[ 1397.477683] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1397.486125] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Task: {'id': task-3468217, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1397.988117] env[67977]: DEBUG oslo_vmware.exceptions [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1397.988117] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1397.988719] env[67977]: ERROR nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1397.988719] env[67977]: Faults: ['InvalidArgument']
[ 1397.988719] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Traceback (most recent call last):
[ 1397.988719] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1397.988719] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] yield resources
[ 1397.988719] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1397.988719] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] self.driver.spawn(context, instance, image_meta,
[ 1397.988719] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1397.988719] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1397.988719] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1397.988719] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] self._fetch_image_if_missing(context, vi)
[ 1397.988719] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] image_cache(vi, tmp_image_ds_loc)
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] vm_util.copy_virtual_disk(
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] session._wait_for_task(vmdk_copy_task)
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] return self.wait_for_task(task_ref)
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] return evt.wait()
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] result = hub.switch()
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1397.988994] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] return self.greenlet.switch()
[ 1397.989278] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1397.989278] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] self.f(*self.args, **self.kw)
[ 1397.989278] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1397.989278] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] raise exceptions.translate_fault(task_info.error)
[ 1397.989278] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1397.989278] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Faults: ['InvalidArgument']
[ 1397.989278] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0]
[ 1397.989278] env[67977]: INFO nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Terminating instance
[ 1397.990638] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1397.990843] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1397.991470] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1397.991653] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1397.991873] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-35e6b3f4-3869-4a66-b7e9-f607a3a22bbb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1397.994090] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33df4284-19b8-439c-868b-65b5bef47ff8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.001038] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1398.001254] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9221b5c0-9e3f-418b-8880-721363c4a248 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.003256] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1398.003425] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1398.004331] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-847331bc-3df9-4a7f-8c77-08e6362fbbaf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.008870] env[67977]: DEBUG oslo_vmware.api [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){
[ 1398.008870] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5235f02b-f20e-76c4-2435-3a5d652c75c8"
[ 1398.008870] env[67977]: _type = "Task"
[ 1398.008870] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1398.016967] env[67977]: DEBUG oslo_vmware.api [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5235f02b-f20e-76c4-2435-3a5d652c75c8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1398.065761] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1398.065963] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1398.066164] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Deleting the datastore file [datastore1] 48d09ae0-ab95-45e8-a916-ecf24abb66a0 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1398.066421] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-721b750d-eec3-488b-b13a-6644be207671 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.072682] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for the task: (returnval){
[ 1398.072682] env[67977]: value = "task-3468219"
[ 1398.072682] env[67977]: _type = "Task"
[ 1398.072682] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1398.079732] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Task: {'id': task-3468219, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1398.519472] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1398.519753] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating directory with path [datastore1] vmware_temp/f0cd65c5-0b47-4f12-82be-da4fe87c0ea0/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1398.519993] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f22fba38-906f-4901-adc2-097796e94175 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.531221] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Created directory with path [datastore1] vmware_temp/f0cd65c5-0b47-4f12-82be-da4fe87c0ea0/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1398.531411] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Fetch image to [datastore1] vmware_temp/f0cd65c5-0b47-4f12-82be-da4fe87c0ea0/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1398.531583] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/f0cd65c5-0b47-4f12-82be-da4fe87c0ea0/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1398.532321] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95e1d018-44e6-4617-a96b-fc5c2dc82dae {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.538820] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e2682aa-87f6-4529-acd6-b8eb4d19fae8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.547454] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34700ba8-020f-4210-a0a2-53815edce879 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.580548] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9976e0cf-728b-4d19-a62c-fc803b258765 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.588777] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-94a2c82f-a60f-401a-8c6a-6c85c2dd0484 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.590550] env[67977]: DEBUG oslo_vmware.api [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Task: {'id': task-3468219, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076507} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1398.590778] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1398.590952] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1398.591134] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1398.591327] env[67977]: INFO nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Took 0.60 seconds to destroy the instance on the hypervisor.
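
The destroy sequence above is UnregisterVM followed by DeleteDatastoreFile_Task on the instance's datastore directory, with the task polled to completion. A schematic outline of those steps, assuming a stand-in session object with invoke() and wait_for_task() helpers (hypothetical names, not the driver's real API):

    class FakeSession:
        # Stand-in for a vCenter session, so the sketch runs on its own.
        def invoke(self, method, *args):
            print('invoking', method, args)
            return 'task-1'
        def wait_for_task(self, task):
            print('waiting for', task)

    def destroy_instance(session, vm_ref, datastore_path):
        session.invoke('UnregisterVM', vm_ref)        # drop the VM from inventory
        task = session.invoke('DeleteDatastoreFile_Task', datastore_path)
        session.wait_for_task(task)                   # then delete its files

    destroy_instance(FakeSession(), 'vm-42',
                     '[datastore1] 48d09ae0-ab95-45e8-a916-ecf24abb66a0')
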
[ 1398.593525] env[67977]: DEBUG nova.compute.claims [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1398.593711] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1398.593925] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1398.614515] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1398.665517] env[67977]: DEBUG oslo_vmware.rw_handles [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f0cd65c5-0b47-4f12-82be-da4fe87c0ea0/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1398.727362] env[67977]: DEBUG oslo_vmware.rw_handles [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1398.727563] env[67977]: DEBUG oslo_vmware.rw_handles [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f0cd65c5-0b47-4f12-82be-da4fe87c0ea0/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1398.896226] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47dbc426-7d71-4680-bd9d-094033993fa1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.904101] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd1cc1e7-b0e3-4dc2-929d-2e537d4a96b3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.933793] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5030e6ad-91a7-4cf4-8c1f-c019591f6441 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.940475] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fccfc0d7-356f-41b0-ba33-14e7ffffb253 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.953885] env[67977]: DEBUG nova.compute.provider_tree [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1398.962382] env[67977]: DEBUG nova.scheduler.client.report [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1398.976062] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.382s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1398.976584] env[67977]: ERROR nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1398.976584] env[67977]: Faults: ['InvalidArgument'] [ 1398.976584] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Traceback (most recent call last): [ 1398.976584] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1398.976584] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] self.driver.spawn(context, instance, image_meta, [ 1398.976584] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1398.976584] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1398.976584] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1398.976584] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] self._fetch_image_if_missing(context, vi) [ 1398.976584] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1398.976584] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] image_cache(vi, tmp_image_ds_loc) [ 1398.976584] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] vm_util.copy_virtual_disk( [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] session._wait_for_task(vmdk_copy_task) [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] return self.wait_for_task(task_ref) [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] return evt.wait() [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] result = hub.switch() [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] return self.greenlet.switch() [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1398.976929] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] self.f(*self.args, **self.kw) [ 1398.977253] env[67977]: ERROR nova.compute.manager [instance: 
48d09ae0-ab95-45e8-a916-ecf24abb66a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1398.977253] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] raise exceptions.translate_fault(task_info.error) [ 1398.977253] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1398.977253] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Faults: ['InvalidArgument'] [ 1398.977253] env[67977]: ERROR nova.compute.manager [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] [ 1398.977381] env[67977]: DEBUG nova.compute.utils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1398.978781] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Build of instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 was re-scheduled: A specified parameter was not correct: fileType [ 1398.978781] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1398.979177] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1398.979354] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1398.979527] env[67977]: DEBUG nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1398.979692] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1399.511642] env[67977]: DEBUG nova.network.neutron [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1399.523151] env[67977]: INFO nova.compute.manager [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Took 0.54 seconds to deallocate network for instance. [ 1399.632043] env[67977]: INFO nova.scheduler.client.report [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Deleted allocations for instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 [ 1399.651693] env[67977]: DEBUG oslo_concurrency.lockutils [None req-07bf0755-8603-426c-8753-f4b529de9b64 tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 656.848s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1399.652825] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 457.153s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1399.653059] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Acquiring lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1399.653265] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0-events" 
acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1399.653430] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1399.656591] env[67977]: INFO nova.compute.manager [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Terminating instance [ 1399.657966] env[67977]: DEBUG nova.compute.manager [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1399.658170] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1399.658642] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0c6260ac-90d9-4e65-984d-272dd15c8450 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.665276] env[67977]: DEBUG nova.compute.manager [None req-3cc51610-cc38-48e9-a008-7268194d729f tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: a61121e3-0375-41ef-9dad-0469583640ab] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1399.671027] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-891a3095-fe26-45d2-9ee0-f5cae88bbf93 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.701788] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 48d09ae0-ab95-45e8-a916-ecf24abb66a0 could not be found. 
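The sequence above — CopyVirtualDisk_Task rejected with InvalidArgument on fileType, the compute claim aborted, the build re-scheduled, the network deallocated, and the later terminate finding no VM on the backend — is Nova's standard build-failure path. A schematic of that control flow, with hypothetical callables standing in for the compute manager's internals (this is an illustration of the order of operations visible in the log, not Nova's actual code):

    class DriverFault(Exception):
        """Stands in for oslo_vmware.exceptions.VimFaultException."""

    def build_with_reschedule(spawn, abort_claim, deallocate_network, reschedule):
        # Mirrors the flow in this log: a driver fault during spawn aborts
        # the resource claim, frees the Neutron allocation, and hands the
        # instance back to the scheduler.
        try:
            spawn()
        except DriverFault as exc:
            abort_claim()           # "Aborting claim" under lock "compute_resources"
            deallocate_network()    # "Deallocating network for instance"
            reschedule(exc)         # "Build of instance ... was re-scheduled"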
[ 1399.702092] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1399.702210] env[67977]: INFO nova.compute.manager [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1399.702430] env[67977]: DEBUG oslo.service.loopingcall [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1399.702875] env[67977]: DEBUG nova.compute.manager [None req-3cc51610-cc38-48e9-a008-7268194d729f tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: a61121e3-0375-41ef-9dad-0469583640ab] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1399.703748] env[67977]: DEBUG nova.compute.manager [-] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1399.703854] env[67977]: DEBUG nova.network.neutron [-] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1399.728670] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3cc51610-cc38-48e9-a008-7268194d729f tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "a61121e3-0375-41ef-9dad-0469583640ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.369s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1399.731061] env[67977]: DEBUG nova.network.neutron [-] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1399.737833] env[67977]: INFO nova.compute.manager [-] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] Took 0.03 seconds to deallocate network for instance. [ 1399.742964] env[67977]: DEBUG nova.compute.manager [None req-32f0bee9-eea2-408b-8367-8620b12489ae tempest-InstanceActionsV221TestJSON-1058234220 tempest-InstanceActionsV221TestJSON-1058234220-project-member] [instance: 8351701c-562c-4ec6-998a-a0b1a62f0c5e] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1399.766527] env[67977]: DEBUG nova.compute.manager [None req-32f0bee9-eea2-408b-8367-8620b12489ae tempest-InstanceActionsV221TestJSON-1058234220 tempest-InstanceActionsV221TestJSON-1058234220-project-member] [instance: 8351701c-562c-4ec6-998a-a0b1a62f0c5e] Instance disappeared before build. 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1399.799094] env[67977]: DEBUG oslo_concurrency.lockutils [None req-32f0bee9-eea2-408b-8367-8620b12489ae tempest-InstanceActionsV221TestJSON-1058234220 tempest-InstanceActionsV221TestJSON-1058234220-project-member] Lock "8351701c-562c-4ec6-998a-a0b1a62f0c5e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 228.061s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1399.809062] env[67977]: DEBUG nova.compute.manager [None req-564df29f-d093-4b9c-ae99-4c262f9e3e06 tempest-ServersTestMultiNic-1583074918 tempest-ServersTestMultiNic-1583074918-project-member] [instance: ea5d10a1-2beb-4997-beff-e6a0e24e5b9d] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1399.841463] env[67977]: DEBUG nova.compute.manager [None req-564df29f-d093-4b9c-ae99-4c262f9e3e06 tempest-ServersTestMultiNic-1583074918 tempest-ServersTestMultiNic-1583074918-project-member] [instance: ea5d10a1-2beb-4997-beff-e6a0e24e5b9d] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1399.852647] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ae6abb61-3ab8-453a-8d6c-26b931ef74bb tempest-ListServersNegativeTestJSON-1979410733 tempest-ListServersNegativeTestJSON-1979410733-project-member] Lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.200s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1399.853436] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 101.128s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1399.853624] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 48d09ae0-ab95-45e8-a916-ecf24abb66a0] During sync_power_state the instance has a pending task (deleting). Skip. 
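The lockutils entries throughout this section record how long each caller waited for and then held a named lock (the build lock above was held 656.848s; terminate waited 457.153s for it). A self-contained approximation of that bookkeeping follows; the real implementation is oslo_concurrency.lockutils, and this sketch only reproduces the waited/held reporting.

    import threading
    import time
    from contextlib import contextmanager

    _locks = {}
    _registry_guard = threading.Lock()

    @contextmanager
    def timed_lock(name):
        # Reproduces the "acquired ... waited X.XXXs" / "released ... held
        # Y.YYYs" bookkeeping visible in the oslo_concurrency.lockutils
        # entries in this log.
        with _registry_guard:
            lock = _locks.setdefault(name, threading.Lock())
        start = time.monotonic()
        lock.acquire()
        print(f'Lock "{name}" acquired :: waited {time.monotonic() - start:.3f}s')
        held_from = time.monotonic()
        try:
            yield
        finally:
            lock.release()
            print(f'Lock "{name}" "released" :: held {time.monotonic() - held_from:.3f}s')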
[ 1399.853799] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "48d09ae0-ab95-45e8-a916-ecf24abb66a0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1399.865249] env[67977]: DEBUG oslo_concurrency.lockutils [None req-564df29f-d093-4b9c-ae99-4c262f9e3e06 tempest-ServersTestMultiNic-1583074918 tempest-ServersTestMultiNic-1583074918-project-member] Lock "ea5d10a1-2beb-4997-beff-e6a0e24e5b9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.830s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1399.873801] env[67977]: DEBUG nova.compute.manager [None req-74451808-c718-429c-937d-4217647056e7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: 33412c29-7b03-421f-9502-4c2a639adfd7] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1399.898146] env[67977]: DEBUG nova.compute.manager [None req-74451808-c718-429c-937d-4217647056e7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: 33412c29-7b03-421f-9502-4c2a639adfd7] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1399.926041] env[67977]: DEBUG oslo_concurrency.lockutils [None req-74451808-c718-429c-937d-4217647056e7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "33412c29-7b03-421f-9502-4c2a639adfd7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.806s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1399.940017] env[67977]: DEBUG nova.compute.manager [None req-e317a14c-c2b2-4fb0-a2c6-1b92546f7f7d tempest-ServersNegativeTestJSON-1846221724 tempest-ServersNegativeTestJSON-1846221724-project-member] [instance: 508bc9c7-8a3b-4ac8-9eda-b2d3f5916997] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1399.965295] env[67977]: DEBUG nova.compute.manager [None req-e317a14c-c2b2-4fb0-a2c6-1b92546f7f7d tempest-ServersNegativeTestJSON-1846221724 tempest-ServersNegativeTestJSON-1846221724-project-member] [instance: 508bc9c7-8a3b-4ac8-9eda-b2d3f5916997] Instance disappeared before build. 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1399.986789] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e317a14c-c2b2-4fb0-a2c6-1b92546f7f7d tempest-ServersNegativeTestJSON-1846221724 tempest-ServersNegativeTestJSON-1846221724-project-member] Lock "508bc9c7-8a3b-4ac8-9eda-b2d3f5916997" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.076s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1399.995759] env[67977]: DEBUG nova.compute.manager [None req-53042408-bbc4-4faf-9d42-414588ce70e6 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 8d6894ae-ecbf-4ad2-ae91-463c88ef5de3] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1400.019214] env[67977]: DEBUG nova.compute.manager [None req-53042408-bbc4-4faf-9d42-414588ce70e6 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 8d6894ae-ecbf-4ad2-ae91-463c88ef5de3] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1400.042709] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53042408-bbc4-4faf-9d42-414588ce70e6 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "8d6894ae-ecbf-4ad2-ae91-463c88ef5de3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.592s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1400.052210] env[67977]: DEBUG nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Starting instance... 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1400.107196] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1400.107453] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1400.108934] env[67977]: INFO nova.compute.claims [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1400.321927] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4179758-fe18-4b0f-b60e-178e92cad133 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.329768] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-623c1790-4fad-4d24-a44f-958a21772457 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.359594] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a505d965-34b1-466b-a353-1aafccdb880f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.366892] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e552767d-4cff-477d-a6bb-a63084b9cb9e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.381607] env[67977]: DEBUG nova.compute.provider_tree [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1400.391199] env[67977]: DEBUG nova.scheduler.client.report [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1400.405934] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 
tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.298s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1400.406761] env[67977]: DEBUG nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1400.438336] env[67977]: DEBUG nova.compute.utils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1400.440529] env[67977]: DEBUG nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1400.440529] env[67977]: DEBUG nova.network.neutron [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1400.448591] env[67977]: DEBUG nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1400.513040] env[67977]: DEBUG nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1400.536400] env[67977]: DEBUG nova.policy [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd35039d87f274119a281d2836618862b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '629b2265a2eb45128d27cb16a9e0304b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1400.539595] env[67977]: DEBUG nova.virt.hardware [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1400.539815] env[67977]: DEBUG nova.virt.hardware [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1400.539970] env[67977]: DEBUG nova.virt.hardware [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1400.540171] env[67977]: DEBUG nova.virt.hardware [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1400.540365] env[67977]: DEBUG nova.virt.hardware [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1400.540529] env[67977]: DEBUG nova.virt.hardware [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1400.540737] env[67977]: DEBUG nova.virt.hardware [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] 
Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1400.540923] env[67977]: DEBUG nova.virt.hardware [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1400.541072] env[67977]: DEBUG nova.virt.hardware [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1400.541238] env[67977]: DEBUG nova.virt.hardware [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1400.541410] env[67977]: DEBUG nova.virt.hardware [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1400.542273] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c786136a-8daf-4866-9cdf-396159f1e610 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.550428] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c93bc685-1ac1-4519-bb61-2555862967d2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.839215] env[67977]: DEBUG nova.network.neutron [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Successfully created port: a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1401.281886] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1401.282084] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1401.282213] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1401.304111] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1401.304270] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1401.304402] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1401.304537] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1401.304659] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1401.304900] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1401.304900] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1401.305014] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1401.305125] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1401.305240] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1401.305358] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1401.517330] env[67977]: DEBUG nova.network.neutron [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Successfully updated port: a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1401.528433] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "refresh_cache-e77a441b-952b-42c0-907f-e30888e505a8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1401.528542] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired lock "refresh_cache-e77a441b-952b-42c0-907f-e30888e505a8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1401.528686] env[67977]: DEBUG nova.network.neutron [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1401.564420] env[67977]: DEBUG nova.network.neutron [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1401.652432] env[67977]: DEBUG nova.compute.manager [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Received event network-vif-plugged-a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1401.652654] env[67977]: DEBUG oslo_concurrency.lockutils [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] Acquiring lock "e77a441b-952b-42c0-907f-e30888e505a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1401.652862] env[67977]: DEBUG oslo_concurrency.lockutils [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] Lock "e77a441b-952b-42c0-907f-e30888e505a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1401.653050] env[67977]: DEBUG oslo_concurrency.lockutils [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] Lock "e77a441b-952b-42c0-907f-e30888e505a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.653224] env[67977]: DEBUG nova.compute.manager [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] [instance: e77a441b-952b-42c0-907f-e30888e505a8] No waiting events found dispatching network-vif-plugged-a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1401.653413] env[67977]: WARNING nova.compute.manager [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Received unexpected event network-vif-plugged-a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c for instance with vm_state building and task_state spawning. [ 1401.653550] env[67977]: DEBUG nova.compute.manager [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Received event network-changed-a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1401.653748] env[67977]: DEBUG nova.compute.manager [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Refreshing instance network info cache due to event network-changed-a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1401.653875] env[67977]: DEBUG oslo_concurrency.lockutils [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] Acquiring lock "refresh_cache-e77a441b-952b-42c0-907f-e30888e505a8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1401.722164] env[67977]: DEBUG nova.network.neutron [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Updating instance_info_cache with network_info: [{"id": "a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c", "address": "fa:16:3e:24:9c:88", "network": {"id": "9f9704ef-f97e-4049-b46f-6c90efa33e6e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-765087939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "629b2265a2eb45128d27cb16a9e0304b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa63fcd71-3b", "ovs_interfaceid": "a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1401.734871] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Releasing lock "refresh_cache-e77a441b-952b-42c0-907f-e30888e505a8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1401.735160] env[67977]: DEBUG nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Instance network_info: |[{"id": "a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c", "address": "fa:16:3e:24:9c:88", "network": {"id": "9f9704ef-f97e-4049-b46f-6c90efa33e6e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-765087939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "629b2265a2eb45128d27cb16a9e0304b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa63fcd71-3b", "ovs_interfaceid": 
"a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1401.735451] env[67977]: DEBUG oslo_concurrency.lockutils [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] Acquired lock "refresh_cache-e77a441b-952b-42c0-907f-e30888e505a8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1401.735629] env[67977]: DEBUG nova.network.neutron [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Refreshing network info cache for port a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1401.736605] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:9c:88', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '89ef02af-c508-432f-ae29-3a219701d584', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1401.744201] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Creating folder: Project (629b2265a2eb45128d27cb16a9e0304b). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1401.745025] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1addf1de-e27f-49d6-a762-164c0b943ac3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.757755] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Created folder: Project (629b2265a2eb45128d27cb16a9e0304b) in parent group-v693022. [ 1401.757938] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Creating folder: Instances. Parent ref: group-v693096. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1401.758170] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a9912f7e-572d-4f55-83ab-894cd86eae0c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.766675] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Created folder: Instances in parent group-v693096. 
[ 1401.766894] env[67977]: DEBUG oslo.service.loopingcall [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1401.767081] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1401.767270] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5adc3c8b-c95b-46cc-ae02-5d2b79a44e04 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.789105] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1401.789105] env[67977]: value = "task-3468222" [ 1401.789105] env[67977]: _type = "Task" [ 1401.789105] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1401.799054] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468222, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1402.019740] env[67977]: DEBUG nova.network.neutron [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Updated VIF entry in instance network info cache for port a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1402.020119] env[67977]: DEBUG nova.network.neutron [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Updating instance_info_cache with network_info: [{"id": "a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c", "address": "fa:16:3e:24:9c:88", "network": {"id": "9f9704ef-f97e-4049-b46f-6c90efa33e6e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-765087939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "629b2265a2eb45128d27cb16a9e0304b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa63fcd71-3b", "ovs_interfaceid": "a63fcd71-3bea-41ce-8fbf-9a3ffed2a17c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1402.030199] env[67977]: DEBUG oslo_concurrency.lockutils [req-9bf071d3-8607-45f5-8af5-1aa0003e50fa req-7f8f243b-214f-421f-bd1f-d30ed47bde6b service nova] Releasing lock "refresh_cache-e77a441b-952b-42c0-907f-e30888e505a8" {{(pid=67977) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1402.299121] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468222, 'name': CreateVM_Task, 'duration_secs': 0.283996} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1402.299306] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1402.300009] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1402.300191] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1402.300598] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1402.300834] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c383ef41-f2cc-40e7-bacb-7edcfd35563e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.305325] env[67977]: DEBUG oslo_vmware.api [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){ [ 1402.305325] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52c50c72-35c2-c3dc-a286-0973cd671aa9" [ 1402.305325] env[67977]: _type = "Task" [ 1402.305325] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1402.312885] env[67977]: DEBUG oslo_vmware.api [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52c50c72-35c2-c3dc-a286-0973cd671aa9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1402.815093] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1402.815378] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1402.815596] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1411.358416] env[67977]: DEBUG oslo_concurrency.lockutils [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "e77a441b-952b-42c0-907f-e30888e505a8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1445.089854] env[67977]: WARNING oslo_vmware.rw_handles [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1445.089854] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1445.089854] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1445.089854] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1445.089854] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1445.089854] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1445.089854] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1445.089854] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1445.089854] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1445.089854] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1445.089854] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1445.089854] env[67977]: ERROR oslo_vmware.rw_handles [ 1445.090492] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to 
vmware_temp/f0cd65c5-0b47-4f12-82be-da4fe87c0ea0/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1445.092424] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1445.092653] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Copying Virtual Disk [datastore1] vmware_temp/f0cd65c5-0b47-4f12-82be-da4fe87c0ea0/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/f0cd65c5-0b47-4f12-82be-da4fe87c0ea0/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1445.092931] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a47236a7-0afd-4574-8ef2-613ad28aeede {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.101260] env[67977]: DEBUG oslo_vmware.api [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 1445.101260] env[67977]: value = "task-3468223" [ 1445.101260] env[67977]: _type = "Task" [ 1445.101260] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1445.109302] env[67977]: DEBUG oslo_vmware.api [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468223, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1445.611144] env[67977]: DEBUG oslo_vmware.exceptions [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1445.611439] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1445.611985] env[67977]: ERROR nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1445.611985] env[67977]: Faults: ['InvalidArgument'] [ 1445.611985] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Traceback (most recent call last): [ 1445.611985] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1445.611985] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] yield resources [ 1445.611985] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1445.611985] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] self.driver.spawn(context, instance, image_meta, [ 1445.611985] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1445.611985] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1445.611985] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1445.611985] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] self._fetch_image_if_missing(context, vi) [ 1445.611985] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] image_cache(vi, tmp_image_ds_loc) [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] vm_util.copy_virtual_disk( [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] session._wait_for_task(vmdk_copy_task) [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] return self.wait_for_task(task_ref) [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] return evt.wait() [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] result = hub.switch() [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1445.612362] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] return self.greenlet.switch() [ 1445.612718] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1445.612718] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] self.f(*self.args, **self.kw) [ 1445.612718] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1445.612718] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] raise exceptions.translate_fault(task_info.error) [ 1445.612718] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1445.612718] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Faults: ['InvalidArgument'] [ 1445.612718] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] [ 1445.612718] env[67977]: INFO nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Terminating instance [ 1445.613820] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1445.614080] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1445.614294] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-60e22600-9d29-4ab3-8c62-7ef7686e2eb8 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.617326] env[67977]: DEBUG nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1445.617518] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1445.618244] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4319f8bc-1550-48e8-8cd4-5ed24ff714e9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.624841] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1445.625414] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9923640d-a054-4332-8f5e-89d5d5a1e932 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.627113] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1445.627296] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1445.628269] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a3a92d2d-a022-4fd5-a2cb-2ce5034d9c7a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.632991] env[67977]: DEBUG oslo_vmware.api [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Waiting for the task: (returnval){ [ 1445.632991] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5229989b-9b23-db0d-1be9-93964a9de349" [ 1445.632991] env[67977]: _type = "Task" [ 1445.632991] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1445.641722] env[67977]: DEBUG oslo_vmware.api [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5229989b-9b23-db0d-1be9-93964a9de349, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1445.695983] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1445.696231] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1445.696417] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleting the datastore file [datastore1] d7719b11-cef7-4878-a693-24dcd085a1d7 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1445.696678] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2bcb1287-14ff-4c34-89d7-e0d84627e614 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.703065] env[67977]: DEBUG oslo_vmware.api [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 1445.703065] env[67977]: value = "task-3468225" [ 1445.703065] env[67977]: _type = "Task" [ 1445.703065] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1445.710419] env[67977]: DEBUG oslo_vmware.api [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468225, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1446.142432] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1446.142705] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Creating directory with path [datastore1] vmware_temp/53b51574-de75-4597-b2e5-99a43cd8a894/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1446.142875] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-300052fb-84d1-450e-8558-d69d8f53f958 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.154939] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Created directory with path [datastore1] vmware_temp/53b51574-de75-4597-b2e5-99a43cd8a894/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1446.154939] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Fetch image to [datastore1] vmware_temp/53b51574-de75-4597-b2e5-99a43cd8a894/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1446.154939] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/53b51574-de75-4597-b2e5-99a43cd8a894/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1446.155475] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ef94523-9da6-4f74-84ea-fafda77b9e29 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.163340] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-493b3442-c417-4818-828c-57a35143b5e8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.176081] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d9b4671-b0be-4270-890c-524196c3d781 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.209250] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e39583ed-b2e1-47a2-a472-52e83e71d1fb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.217691] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c146a4da-0964-4b6f-a744-548b4f4cc793 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.219370] env[67977]: DEBUG oslo_vmware.api [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468225, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066122} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1446.219608] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1446.219790] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1446.219958] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1446.220149] env[67977]: INFO nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Took 0.60 seconds to destroy the instance on the hypervisor. 
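[editor's note] The spawn failure above ends in the generic VimFaultException because oslo.vmware's task poller raises exceptions.translate_fault(task_info.error), and the earlier "Fault InvalidArgument not matched" line means no dedicated exception class is registered for that fault name, so the generic class is used with the raw fault names preserved. A hedged sketch of how a caller can distinguish this case; session, copy_task_ref and handle_invalid_spec are illustrative names, not nova code:

    from oslo_vmware import exceptions as vexc

    try:
        session.wait_for_task(copy_task_ref)  # e.g. a CopyVirtualDisk_Task moref
    except vexc.VimFaultException as err:
        # err.fault_list carries the raw VIM fault names, here
        # ['InvalidArgument']; str(err) includes the server message
        # ("A specified parameter was not correct: fileType").
        if 'InvalidArgument' in err.fault_list:
            handle_invalid_spec(err)  # hypothetical: abort the claim, reschedule
        else:
            raise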
[ 1446.222563] env[67977]: DEBUG nova.compute.claims [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1446.222733] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1446.222950] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1446.239762] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1446.411759] env[67977]: DEBUG oslo_vmware.rw_handles [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/53b51574-de75-4597-b2e5-99a43cd8a894/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1446.472491] env[67977]: DEBUG oslo_vmware.rw_handles [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1446.472675] env[67977]: DEBUG oslo_vmware.rw_handles [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/53b51574-de75-4597-b2e5-99a43cd8a894/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1446.481103] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45b0b0c9-b70f-488c-87e8-6837d1855898 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.488619] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4374af27-7a5b-4cab-9b82-409db3528110 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.169736] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a54049d-046c-4a5d-81a0-dc6f123e72a4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.177379] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30855e1a-154e-4fe8-885a-7ff917dbd1b4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.189778] env[67977]: DEBUG nova.compute.provider_tree [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1447.197788] env[67977]: DEBUG nova.scheduler.client.report [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1447.215858] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.993s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1447.216402] env[67977]: ERROR nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1447.216402] env[67977]: Faults: ['InvalidArgument'] [ 1447.216402] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Traceback (most recent call last): [ 1447.216402] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1447.216402] env[67977]: ERROR 
nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] self.driver.spawn(context, instance, image_meta, [ 1447.216402] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1447.216402] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1447.216402] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1447.216402] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] self._fetch_image_if_missing(context, vi) [ 1447.216402] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1447.216402] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] image_cache(vi, tmp_image_ds_loc) [ 1447.216402] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] vm_util.copy_virtual_disk( [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] session._wait_for_task(vmdk_copy_task) [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] return self.wait_for_task(task_ref) [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] return evt.wait() [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] result = hub.switch() [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] return self.greenlet.switch() [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1447.216739] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] self.f(*self.args, **self.kw) [ 1447.217075] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1447.217075] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] raise exceptions.translate_fault(task_info.error) [ 1447.217075] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1447.217075] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Faults: ['InvalidArgument'] [ 1447.217075] env[67977]: ERROR nova.compute.manager [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] [ 1447.217215] env[67977]: DEBUG nova.compute.utils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1447.218490] env[67977]: DEBUG nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Build of instance d7719b11-cef7-4878-a693-24dcd085a1d7 was re-scheduled: A specified parameter was not correct: fileType [ 1447.218490] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1447.218834] env[67977]: DEBUG nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1447.219037] env[67977]: DEBUG nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1447.219219] env[67977]: DEBUG nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1447.219380] env[67977]: DEBUG nova.network.neutron [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1447.506298] env[67977]: DEBUG nova.network.neutron [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1447.515625] env[67977]: INFO nova.compute.manager [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Took 0.30 seconds to deallocate network for instance. [ 1447.612209] env[67977]: INFO nova.scheduler.client.report [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleted allocations for instance d7719b11-cef7-4878-a693-24dcd085a1d7 [ 1447.635989] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c64b2321-bcd9-4fde-9cda-c4e7255f939e tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "d7719b11-cef7-4878-a693-24dcd085a1d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 681.449s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1447.636933] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "d7719b11-cef7-4878-a693-24dcd085a1d7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 482.950s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1447.637225] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "d7719b11-cef7-4878-a693-24dcd085a1d7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1447.637421] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "d7719b11-cef7-4878-a693-24dcd085a1d7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1447.638032] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "d7719b11-cef7-4878-a693-24dcd085a1d7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1447.639537] env[67977]: INFO nova.compute.manager [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Terminating instance [ 1447.640994] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "refresh_cache-d7719b11-cef7-4878-a693-24dcd085a1d7" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1447.641995] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "refresh_cache-d7719b11-cef7-4878-a693-24dcd085a1d7" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1447.641995] env[67977]: DEBUG nova.network.neutron [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1447.648628] env[67977]: DEBUG nova.compute.manager [None req-187eec75-b383-4799-9ace-f9e8ed512ff9 tempest-ListImageFiltersTestJSON-414035292 tempest-ListImageFiltersTestJSON-414035292-project-member] [instance: 1e05e7be-d468-4908-a1e4-8a11064277b1] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1447.669318] env[67977]: DEBUG nova.network.neutron [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1447.675101] env[67977]: DEBUG nova.compute.manager [None req-187eec75-b383-4799-9ace-f9e8ed512ff9 tempest-ListImageFiltersTestJSON-414035292 tempest-ListImageFiltersTestJSON-414035292-project-member] [instance: 1e05e7be-d468-4908-a1e4-8a11064277b1] Instance disappeared before build. 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1447.697443] env[67977]: DEBUG oslo_concurrency.lockutils [None req-187eec75-b383-4799-9ace-f9e8ed512ff9 tempest-ListImageFiltersTestJSON-414035292 tempest-ListImageFiltersTestJSON-414035292-project-member] Lock "1e05e7be-d468-4908-a1e4-8a11064277b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.259s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1447.707962] env[67977]: DEBUG nova.compute.manager [None req-e936d7e9-964b-4423-967d-fe4e52b2435c tempest-ListImageFiltersTestJSON-414035292 tempest-ListImageFiltersTestJSON-414035292-project-member] [instance: ee368449-13fb-431d-9ae5-f8c08d777336] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1447.730429] env[67977]: DEBUG nova.compute.manager [None req-e936d7e9-964b-4423-967d-fe4e52b2435c tempest-ListImageFiltersTestJSON-414035292 tempest-ListImageFiltersTestJSON-414035292-project-member] [instance: ee368449-13fb-431d-9ae5-f8c08d777336] Instance disappeared before build. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1447.750247] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e936d7e9-964b-4423-967d-fe4e52b2435c tempest-ListImageFiltersTestJSON-414035292 tempest-ListImageFiltersTestJSON-414035292-project-member] Lock "ee368449-13fb-431d-9ae5-f8c08d777336" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.118s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1447.762224] env[67977]: DEBUG nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1447.799878] env[67977]: DEBUG nova.network.neutron [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1447.810746] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "refresh_cache-d7719b11-cef7-4878-a693-24dcd085a1d7" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1447.811169] env[67977]: DEBUG nova.compute.manager [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1447.811377] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1447.811859] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-90abcb25-f7a6-46a6-b430-bd64917740b4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.816055] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1447.816287] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1447.817779] env[67977]: INFO nova.compute.claims [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1447.824096] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43af3bd6-b3b4-4c33-95de-f1456ed3782d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.852805] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d7719b11-cef7-4878-a693-24dcd085a1d7 could not be found. [ 1447.853034] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1447.853224] env[67977]: INFO nova.compute.manager [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1447.853486] env[67977]: DEBUG oslo.service.loopingcall [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1447.853701] env[67977]: DEBUG nova.compute.manager [-] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1447.853798] env[67977]: DEBUG nova.network.neutron [-] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1447.871354] env[67977]: DEBUG nova.network.neutron [-] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1447.879033] env[67977]: DEBUG nova.network.neutron [-] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1447.887029] env[67977]: INFO nova.compute.manager [-] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] Took 0.03 seconds to deallocate network for instance. [ 1447.991867] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f8bc2d60-6750-450f-a667-3409afdb52aa tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "d7719b11-cef7-4878-a693-24dcd085a1d7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.355s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1447.993596] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "d7719b11-cef7-4878-a693-24dcd085a1d7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 149.267s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1447.993596] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d7719b11-cef7-4878-a693-24dcd085a1d7] During sync_power_state the instance has a pending task (deleting). Skip. 
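[editor's note] The lock bookkeeping in this stretch ("acquired ... waited 482.950s", "released ... held 681.449s") is emitted by oslo.concurrency every time a decorated critical section is entered or left; the long wait simply means do_terminate_instance queued behind the build path holding the same per-instance lock name. A minimal sketch of that serialization pattern (the lock name and function body are illustrative):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('d7719b11-cef7-4878-a693-24dcd085a1d7')
    def do_terminate_instance():
        # Blocks until the holder of the identically named lock releases it;
        # the DEBUG lines above record how long this caller waited and how
        # long the critical section was then held.
        ...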
[ 1447.993596] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "d7719b11-cef7-4878-a693-24dcd085a1d7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1448.049550] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-764a9450-7957-4ed9-a9b0-db5b4aa4b933 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.057779] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d809581-0ece-4697-a58c-ad4ef5cb9e40 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.087920] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c77893f2-071b-4175-bd92-04e135b8667f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.094614] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c37c4af2-6aa2-46e0-b856-4e530aba771f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.107061] env[67977]: DEBUG nova.compute.provider_tree [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1448.115762] env[67977]: DEBUG nova.scheduler.client.report [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1448.130910] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.314s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1448.131525] env[67977]: DEBUG nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Start building networks asynchronously for instance.
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1448.164776] env[67977]: DEBUG nova.compute.utils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1448.166311] env[67977]: DEBUG nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1448.166311] env[67977]: DEBUG nova.network.neutron [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1448.177043] env[67977]: DEBUG nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1448.226117] env[67977]: DEBUG nova.policy [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2dbe42dbff0147799c65ecddff92a251', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '80a319585ad347c7b61cf30ed69b5023', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1448.249437] env[67977]: DEBUG nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1448.277154] env[67977]: DEBUG nova.virt.hardware [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1448.277405] env[67977]: DEBUG nova.virt.hardware [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1448.277655] env[67977]: DEBUG nova.virt.hardware [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1448.277864] env[67977]: DEBUG nova.virt.hardware [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1448.278023] env[67977]: DEBUG nova.virt.hardware [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1448.278177] env[67977]: DEBUG nova.virt.hardware [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1448.278388] env[67977]: DEBUG nova.virt.hardware [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1448.278547] env[67977]: DEBUG nova.virt.hardware [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1448.278711] env[67977]: DEBUG 
nova.virt.hardware [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1448.278871] env[67977]: DEBUG nova.virt.hardware [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1448.279044] env[67977]: DEBUG nova.virt.hardware [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1448.279967] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6426fb9-08e8-4e49-8ea0-ae4af1f63d08 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.288246] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78c9d423-c187-49ae-aa4a-31c295a703fb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.561570] env[67977]: DEBUG nova.network.neutron [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Successfully created port: 11811a54-293c-4069-82d6-63e0fed9846c {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1449.201709] env[67977]: DEBUG nova.network.neutron [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Successfully updated port: 11811a54-293c-4069-82d6-63e0fed9846c {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1449.213063] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquiring lock "refresh_cache-1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1449.213221] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquired lock "refresh_cache-1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1449.213370] env[67977]: DEBUG nova.network.neutron [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1449.253391] env[67977]: DEBUG nova.network.neutron [None 
req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1449.419186] env[67977]: DEBUG nova.network.neutron [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Updating instance_info_cache with network_info: [{"id": "11811a54-293c-4069-82d6-63e0fed9846c", "address": "fa:16:3e:65:09:f6", "network": {"id": "a9022aa9-602d-471f-9a12-48dd15cdb0f3", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-288932125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80a319585ad347c7b61cf30ed69b5023", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd98a25d-a7a9-4fb5-8fef-e8df4dbbbf11", "external-id": "nsx-vlan-transportzone-707", "segmentation_id": 707, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11811a54-29", "ovs_interfaceid": "11811a54-293c-4069-82d6-63e0fed9846c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1449.431885] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Releasing lock "refresh_cache-1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1449.432278] env[67977]: DEBUG nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Instance network_info: |[{"id": "11811a54-293c-4069-82d6-63e0fed9846c", "address": "fa:16:3e:65:09:f6", "network": {"id": "a9022aa9-602d-471f-9a12-48dd15cdb0f3", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-288932125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80a319585ad347c7b61cf30ed69b5023", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd98a25d-a7a9-4fb5-8fef-e8df4dbbbf11", "external-id": "nsx-vlan-transportzone-707", "segmentation_id": 707, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11811a54-29", "ovs_interfaceid": 
"11811a54-293c-4069-82d6-63e0fed9846c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1449.432697] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:65:09:f6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dd98a25d-a7a9-4fb5-8fef-e8df4dbbbf11', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '11811a54-293c-4069-82d6-63e0fed9846c', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1449.440527] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Creating folder: Project (80a319585ad347c7b61cf30ed69b5023). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1449.441089] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-61955b68-41ad-49e7-ae4c-0394c52317ec {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.452485] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Created folder: Project (80a319585ad347c7b61cf30ed69b5023) in parent group-v693022. [ 1449.452676] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Creating folder: Instances. Parent ref: group-v693099. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1449.453207] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ef257bc3-cbb9-4f6c-805b-adb9e28ddbcf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.462979] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Created folder: Instances in parent group-v693099. [ 1449.463233] env[67977]: DEBUG oslo.service.loopingcall [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1449.463422] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1449.463618] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-98ffe07d-ce80-45c7-8acb-6bcaf34bec4d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.482682] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1449.482682] env[67977]: value = "task-3468228" [ 1449.482682] env[67977]: _type = "Task" [ 1449.482682] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1449.490093] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468228, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1449.542077] env[67977]: DEBUG nova.compute.manager [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Received event network-vif-plugged-11811a54-293c-4069-82d6-63e0fed9846c {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1449.542319] env[67977]: DEBUG oslo_concurrency.lockutils [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] Acquiring lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1449.542532] env[67977]: DEBUG oslo_concurrency.lockutils [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] Lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1449.542698] env[67977]: DEBUG oslo_concurrency.lockutils [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] Lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1449.542867] env[67977]: DEBUG nova.compute.manager [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] No waiting events found dispatching network-vif-plugged-11811a54-293c-4069-82d6-63e0fed9846c {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1449.543056] env[67977]: WARNING nova.compute.manager [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Received unexpected event network-vif-plugged-11811a54-293c-4069-82d6-63e0fed9846c for instance with vm_state building and task_state spawning.
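The CreateVM_Task entries above show oslo.vmware's task-waiting pattern: wait_for_task starts a fixed-interval looping call that polls the vCenter task, logs "progress is N%", and finishes once the task reports success. A rough, self-contained sketch of that polling shape built directly on oslo.service's looping call; the poll function and return value are illustrative stand-ins, not the actual oslo_vmware.api._poll_task quoted in the log:

    from oslo_service import loopingcall

    def _poll_task():
        # The real poller reads the task's progress from vCenter and keeps
        # looping; raising LoopingCallDone ends the loop with a result.
        raise loopingcall.LoopingCallDone(retvalue='success')

    timer = loopingcall.FixedIntervalLoopingCall(_poll_task)
    result = timer.start(interval=0.5).wait()  # blocks until LoopingCallDone
    print(result)  # -> success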
[ 1449.543190] env[67977]: DEBUG nova.compute.manager [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Received event network-changed-11811a54-293c-4069-82d6-63e0fed9846c {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1449.543364] env[67977]: DEBUG nova.compute.manager [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Refreshing instance network info cache due to event network-changed-11811a54-293c-4069-82d6-63e0fed9846c. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1449.543564] env[67977]: DEBUG oslo_concurrency.lockutils [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] Acquiring lock "refresh_cache-1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1449.543721] env[67977]: DEBUG oslo_concurrency.lockutils [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] Acquired lock "refresh_cache-1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1449.543889] env[67977]: DEBUG nova.network.neutron [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Refreshing network info cache for port 11811a54-293c-4069-82d6-63e0fed9846c {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1449.790132] env[67977]: DEBUG nova.network.neutron [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Updated VIF entry in instance network info cache for port 11811a54-293c-4069-82d6-63e0fed9846c. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1449.790509] env[67977]: DEBUG nova.network.neutron [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Updating instance_info_cache with network_info: [{"id": "11811a54-293c-4069-82d6-63e0fed9846c", "address": "fa:16:3e:65:09:f6", "network": {"id": "a9022aa9-602d-471f-9a12-48dd15cdb0f3", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-288932125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80a319585ad347c7b61cf30ed69b5023", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd98a25d-a7a9-4fb5-8fef-e8df4dbbbf11", "external-id": "nsx-vlan-transportzone-707", "segmentation_id": 707, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11811a54-29", "ovs_interfaceid": "11811a54-293c-4069-82d6-63e0fed9846c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1449.799630] env[67977]: DEBUG oslo_concurrency.lockutils [req-e48dbbb4-96e4-43d8-a8bd-42715a678913 req-421378d1-296a-4884-9937-32e444895fce service nova] Releasing lock "refresh_cache-1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1449.992617] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468228, 'name': CreateVM_Task} progress is 99%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1450.492558] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468228, 'name': CreateVM_Task} progress is 99%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1450.993934] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468228, 'name': CreateVM_Task, 'duration_secs': 1.291323} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1450.994126] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1450.994819] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1450.994986] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1450.995322] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1450.995568] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e4a1805a-355c-4919-aa25-e2170920f38f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1450.999744] env[67977]: DEBUG oslo_vmware.api [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Waiting for the task: (returnval){ [ 1450.999744] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d87c53-4b45-7dbf-99bd-5839f9f3d013" [ 1450.999744] env[67977]: _type = "Task" [ 1450.999744] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1451.006731] env[67977]: DEBUG oslo_vmware.api [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d87c53-4b45-7dbf-99bd-5839f9f3d013, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1451.510517] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1451.510821] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1451.510977] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1453.775022] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1453.775344] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1454.775588] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1454.775858] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1455.774635] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1455.774881] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1456.770778] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1456.774481] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1456.774674] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1456.786428] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1456.786634] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1456.786796] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1456.786948] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1456.788018] 
env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e166f44-ed36-4479-a0a8-aa01920f50e1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.796580] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1902d5a-9187-407a-9684-ab1f2ab09bdc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.811032] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2c8307b-70f0-4d82-8b42-deb88467ee4b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.816999] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1f7734c-09c9-480f-8e14-193a9866515d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.845106] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180934MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1456.845261] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1456.845452] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1456.920024] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1456.920024] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1456.920166] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1456.920263] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 32d860b3-f438-400f-8296-e62cc662d618 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1456.920414] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1456.920586] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8f1440c5-e712-4635-9f02-f9cda12da693 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1456.920644] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f03fe248-75df-4237-a6dd-cc49012c2331 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1456.920758] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1456.920888] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e77a441b-952b-42c0-907f-e30888e505a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1456.921015] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1456.932162] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6fae5126-6618-4337-9a52-d6019727e0b0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1456.942525] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b56ab7a8-cd27-4542-8082-ec023c57e153 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1456.954334] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1456.954617] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1456.954771] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1457.105227] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5e113c7-a243-4492-9c31-31099b32d52d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1457.112710] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cf2de89-ebe1-4da7-b67d-d189899ded3f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1457.142666] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2c98ab3-f0e5-47e0-9ae6-5537fcf1ef8f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1457.149130] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69a94c26-bdc7-4407-9533-a07af4d1bcad {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1457.161935] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1457.170708] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1457.185025] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1457.185229] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.340s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1461.186176] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1461.186480] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1461.186480] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1461.206866] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1461.207055] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1461.207147] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1461.207371] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1461.207501] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1461.207623] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1461.207743] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1461.207861] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1461.207979] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1461.208115] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1461.208225] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1463.411933] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1463.412279] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1467.210816] env[67977]: DEBUG oslo_concurrency.lockutils [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquiring lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1494.255182] env[67977]: WARNING oslo_vmware.rw_handles [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1494.255182] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1494.255182] env[67977]: ERROR oslo_vmware.rw_handles File
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1494.255182] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1494.255182] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1494.255182] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1494.255182] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1494.255182] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1494.255182] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1494.255182] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1494.255182] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1494.255182] env[67977]: ERROR oslo_vmware.rw_handles [ 1494.255720] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/53b51574-de75-4597-b2e5-99a43cd8a894/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1494.257906] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1494.258188] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Copying Virtual Disk [datastore1] vmware_temp/53b51574-de75-4597-b2e5-99a43cd8a894/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/53b51574-de75-4597-b2e5-99a43cd8a894/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1494.258468] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-224868e5-ed74-4a11-987d-92d2a95bd270 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.266526] env[67977]: DEBUG oslo_vmware.api [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Waiting for the task: (returnval){ [ 1494.266526] env[67977]: value = "task-3468229" [ 1494.266526] env[67977]: _type = "Task" [ 1494.266526] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1494.275042] env[67977]: DEBUG oslo_vmware.api [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Task: {'id': task-3468229, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1494.776779] env[67977]: DEBUG oslo_vmware.exceptions [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1494.777070] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1494.777620] env[67977]: ERROR nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1494.777620] env[67977]: Faults: ['InvalidArgument'] [ 1494.777620] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Traceback (most recent call last): [ 1494.777620] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1494.777620] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] yield resources [ 1494.777620] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1494.777620] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] self.driver.spawn(context, instance, image_meta, [ 1494.777620] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1494.777620] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1494.777620] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1494.777620] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] self._fetch_image_if_missing(context, vi) [ 1494.777620] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] image_cache(vi, tmp_image_ds_loc) [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] vm_util.copy_virtual_disk( [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] session._wait_for_task(vmdk_copy_task) [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] return self.wait_for_task(task_ref) [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] return evt.wait() [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] result = hub.switch() [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1494.777967] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] return self.greenlet.switch() [ 1494.778275] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1494.778275] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] self.f(*self.args, **self.kw) [ 1494.778275] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1494.778275] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] raise exceptions.translate_fault(task_info.error) [ 1494.778275] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1494.778275] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Faults: ['InvalidArgument'] [ 1494.778275] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] [ 1494.778275] env[67977]: INFO nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Terminating instance [ 1494.779454] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1494.779662] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a2ded50-7aed-4cee-ae91-c6352413496f 
tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1494.779891] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a63cc15e-9662-4a7a-834d-80b6ac954603 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.782226] env[67977]: DEBUG nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1494.782417] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1494.783186] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55d06cb9-3d0a-4f10-bf76-930b735851c9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.789573] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1494.789778] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4f894e03-b23a-4670-9321-3ac7bdcbce81 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.791837] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1494.792040] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1494.792945] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-285b3633-83cd-4e41-b824-7d15c386574d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.797419] env[67977]: DEBUG oslo_vmware.api [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Waiting for the task: (returnval){ [ 1494.797419] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]524994ff-5a70-1703-f53f-1a9c56e90a43" [ 1494.797419] env[67977]: _type = "Task" [ 1494.797419] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1494.804141] env[67977]: DEBUG oslo_vmware.api [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]524994ff-5a70-1703-f53f-1a9c56e90a43, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1494.866197] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1494.866459] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1494.866662] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Deleting the datastore file [datastore1] 6e2f1b5e-7bdc-463d-9822-810f99b81623 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1494.866940] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f219c95d-d2c0-4bb7-8fa9-9c38dd136602 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.872786] env[67977]: DEBUG oslo_vmware.api [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Waiting for the task: (returnval){ [ 1494.872786] env[67977]: value = "task-3468231" [ 1494.872786] env[67977]: _type = "Task" [ 1494.872786] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1494.879933] env[67977]: DEBUG oslo_vmware.api [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Task: {'id': task-3468231, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1495.307904] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1495.308233] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Creating directory with path [datastore1] vmware_temp/98294a6f-4379-44dd-b9a5-4a5655330974/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1495.308461] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5ceeab07-41d9-48ee-8698-41cc8bbd8448 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.319223] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Created directory with path [datastore1] vmware_temp/98294a6f-4379-44dd-b9a5-4a5655330974/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1495.319406] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Fetch image to [datastore1] vmware_temp/98294a6f-4379-44dd-b9a5-4a5655330974/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1495.319575] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/98294a6f-4379-44dd-b9a5-4a5655330974/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1495.320311] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-147b7b1e-9eda-4afb-a3ef-c1a4742d6835 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.326743] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc8ddb39-0910-49da-a7b4-0a24f4ba70f8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.335573] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b214b2c-ab46-45c0-9e2e-2ce8aea0101c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.364847] env[67977]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aa5739c-1810-40ac-8033-264fdc68d496 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.370045] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-11d1242f-106a-4a7c-8db7-f941d3e5b6b9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.381995] env[67977]: DEBUG oslo_vmware.api [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Task: {'id': task-3468231, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077092} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1495.382267] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1495.382482] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1495.382676] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1495.382912] env[67977]: INFO nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Took 0.60 seconds to destroy the instance on the hypervisor. 
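The two tracebacks above carry the whole failure chain for instance 6e2f1b5e: the image write handle died with http.client.RemoteDisconnected, and the follow-up CopyVirtualDisk_Task error was raised by oslo.vmware's poll loop (api.py:448, _poll_task) as VimFaultException with Faults: ['InvalidArgument']. A minimal sketch of that task-polling pattern, with illustrative names rather than the oslo.vmware implementation:

    import time

    class TaskFailed(Exception):
        def __init__(self, message, faults):
            super().__init__(message)
            self.faults = faults

    def wait_for_task(get_task_info, interval=0.5):
        # Poll a vCenter-style task until it leaves its transient states.
        while True:
            info = get_task_info()
            if info['state'] in ('queued', 'running'):
                time.sleep(interval)  # oslo.vmware drives this via a looping call
                continue
            if info['state'] == 'success':
                return info.get('result')
            # state == 'error': the fault carried by the task becomes the
            # exception seen in the traceback above.
            raise TaskFailed(info['error']['message'], info['error']['faults'])

    # Simulated run reproducing the failure mode logged above:
    states = iter([
        {'state': 'running'},
        {'state': 'error',
         'error': {'message': 'A specified parameter was not correct: fileType',
                   'faults': ['InvalidArgument']}},
    ])
    try:
        wait_for_task(lambda: next(states), interval=0)
    except TaskFailed as exc:
        print(exc, exc.faults)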
[ 1495.385067] env[67977]: DEBUG nova.compute.claims [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1495.385285] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1495.385532] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1495.392962] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1495.510753] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/98294a6f-4379-44dd-b9a5-4a5655330974/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1495.571774] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1495.572014] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/98294a6f-4379-44dd-b9a5-4a5655330974/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1495.647014] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffda07c1-fe17-412f-b836-c43e4a689820 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1495.654458] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60d9786e-bd53-44c0-812c-b0c34b3a1bbc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1495.683371] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e865571f-3d9c-4f6f-875f-069cc80afc17 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1495.689930] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51e313e7-8b39-47aa-956e-55120bd5d80a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1495.703103] env[67977]: DEBUG nova.compute.provider_tree [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1495.711624] env[67977]: DEBUG nova.scheduler.client.report [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1495.727426] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.342s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1495.727942] env[67977]: ERROR nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1495.727942] env[67977]: Faults: ['InvalidArgument']
[ 1495.727942] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Traceback (most recent call last):
[ 1495.727942] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1495.727942] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] self.driver.spawn(context, instance, image_meta,
[ 1495.727942] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1495.727942] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1495.727942] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1495.727942] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] self._fetch_image_if_missing(context, vi)
[ 1495.727942] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1495.727942] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] image_cache(vi, tmp_image_ds_loc)
[ 1495.727942] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] vm_util.copy_virtual_disk(
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] session._wait_for_task(vmdk_copy_task)
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] return self.wait_for_task(task_ref)
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] return evt.wait()
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] result = hub.switch()
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] return self.greenlet.switch()
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1495.728430] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] self.f(*self.args, **self.kw)
[ 1495.728716] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1495.728716] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] raise exceptions.translate_fault(task_info.error)
[ 1495.728716] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1495.728716] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Faults: ['InvalidArgument']
[ 1495.728716] env[67977]: ERROR nova.compute.manager [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623]
[ 1495.728716] env[67977]: DEBUG nova.compute.utils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1495.729929] env[67977]: DEBUG nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Build of instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 was re-scheduled: A specified parameter was not correct: fileType
[ 1495.729929] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1495.730341] env[67977]: DEBUG nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1495.730521] env[67977]: DEBUG nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1495.730692] env[67977]: DEBUG nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1495.730853] env[67977]: DEBUG nova.network.neutron [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1496.105760] env[67977]: DEBUG nova.network.neutron [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1496.119540] env[67977]: INFO nova.compute.manager [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Took 0.39 seconds to deallocate network for instance.
[ 1496.213191] env[67977]: INFO nova.scheduler.client.report [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Deleted allocations for instance 6e2f1b5e-7bdc-463d-9822-810f99b81623
[ 1496.233279] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b4910db8-002d-407e-88f8-db4f691608e8 tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Lock "6e2f1b5e-7bdc-463d-9822-810f99b81623" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 674.619s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1496.235180] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Lock "6e2f1b5e-7bdc-463d-9822-810f99b81623" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 477.721s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1496.235180] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Acquiring lock "6e2f1b5e-7bdc-463d-9822-810f99b81623-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1496.235180] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Lock "6e2f1b5e-7bdc-463d-9822-810f99b81623-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1496.235395] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Lock "6e2f1b5e-7bdc-463d-9822-810f99b81623-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1496.239268] env[67977]: INFO nova.compute.manager [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Terminating instance
[ 1496.240548] env[67977]: DEBUG nova.compute.manager [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1496.241192] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1496.241192] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-508fdfea-1f64-4f8d-94aa-fd3aac4c7657 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1496.246581] env[67977]: DEBUG nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1496.253451] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75dd1f87-c299-495e-b6c0-331d331bc650 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1496.283860] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6e2f1b5e-7bdc-463d-9822-810f99b81623 could not be found.
[ 1496.284080] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1496.284261] env[67977]: INFO nova.compute.manager [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Took 0.04 seconds to destroy the instance on the hypervisor.
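The lockutils entries above follow a fixed Acquiring / acquired / "released" sequence, and the waited/held figures (674.619s held on the instance lock by the build path, 477.721s waited by the terminate path) show exactly how the two operations serialize on it. A hypothetical sketch of that usage pattern, assuming oslo.concurrency is installed; the lock name mirrors the log, the body is a placeholder:

    import time
    from oslo_concurrency import lockutils

    def abort_instance_claim(tracker):
        requested = time.monotonic()
        with lockutils.lock('compute_resources'):  # logs Acquiring/acquired
            acquired = time.monotonic()
            # ... roll back the instance's resource claim here ...
        released = time.monotonic()  # logs "released" :: held N.NNNs
        print('waited %.3fs, held %.3fs'
              % (acquired - requested, released - acquired))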
[ 1496.284503] env[67977]: DEBUG oslo.service.loopingcall [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1496.286692] env[67977]: DEBUG nova.compute.manager [-] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1496.286784] env[67977]: DEBUG nova.network.neutron [-] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1496.300712] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1496.300960] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1496.302449] env[67977]: INFO nova.compute.claims [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1496.312385] env[67977]: DEBUG nova.network.neutron [-] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1496.323114] env[67977]: INFO nova.compute.manager [-] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] Took 0.04 seconds to deallocate network for instance.
[ 1496.415444] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8bd0f4e4-efee-4e2d-92ce-602dbcf29cac tempest-ImagesNegativeTestJSON-1994938285 tempest-ImagesNegativeTestJSON-1994938285-project-member] Lock "6e2f1b5e-7bdc-463d-9822-810f99b81623" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.181s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1496.416166] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "6e2f1b5e-7bdc-463d-9822-810f99b81623" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 197.690s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1496.416358] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6e2f1b5e-7bdc-463d-9822-810f99b81623] During sync_power_state the instance has a pending task (deleting). Skip.
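The inventory dict the scheduler report client logs (above, and again below for the next claim) is what placement admits new claims against. Capacity per resource class is derived as (total - reserved) * allocation_ratio, with max_unit capping any single allocation; a back-of-the-envelope check of the figures in this log, as a sketch of that arithmetic rather than placement code:

    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 94},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
        print(rc, capacity, 'max per allocation:', inv['max_unit'])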
[ 1496.417043] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "6e2f1b5e-7bdc-463d-9822-810f99b81623" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1496.535253] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba9d8346-4952-46f1-b8d0-35b04fc3e119 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1496.542876] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cdd4529-e9ca-4a86-8565-a109b7df0def {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1496.572996] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57a44257-c92f-447d-93fc-c174c1db502f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1496.580169] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1808432f-3481-4bec-af9e-8955663fee65 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1496.593326] env[67977]: DEBUG nova.compute.provider_tree [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1496.601999] env[67977]: DEBUG nova.scheduler.client.report [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1496.617902] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1496.618378] env[67977]: DEBUG nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1496.649114] env[67977]: DEBUG nova.compute.utils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1496.650800] env[67977]: DEBUG nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1496.650800] env[67977]: DEBUG nova.network.neutron [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1496.660881] env[67977]: DEBUG nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1496.725476] env[67977]: DEBUG nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1496.743137] env[67977]: DEBUG nova.policy [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd76b3cc7fe2143dabe6ab02906a25097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '52e6b27298274fa1a10d95d9a967814b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1496.753336] env[67977]: DEBUG nova.virt.hardware [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1496.753595] env[67977]: DEBUG nova.virt.hardware [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1496.753760] env[67977]: DEBUG nova.virt.hardware [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1496.753942] env[67977]: DEBUG nova.virt.hardware [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1496.754105] env[67977]: DEBUG nova.virt.hardware [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1496.754255] env[67977]: DEBUG nova.virt.hardware [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1496.754459] env[67977]: DEBUG nova.virt.hardware [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1496.754616] env[67977]: DEBUG nova.virt.hardware [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1496.754784] env[67977]: DEBUG nova.virt.hardware [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1496.754946] env[67977]: DEBUG nova.virt.hardware [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1496.755267] env[67977]: DEBUG nova.virt.hardware [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1496.756140] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1017ac98-1872-4964-8ecd-35c1ad600ec6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1496.764415] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32933534-2fd8-41b6-81a4-9b9e3f1316d7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1497.065710] env[67977]: DEBUG nova.network.neutron [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Successfully created port: 3664995e-eabb-4e28-ac81-bf3c252e91df {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1497.685515] env[67977]: DEBUG nova.network.neutron [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Successfully updated port: 3664995e-eabb-4e28-ac81-bf3c252e91df {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1497.700666] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "refresh_cache-6fae5126-6618-4337-9a52-d6019727e0b0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1497.700666] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired lock "refresh_cache-6fae5126-6618-4337-9a52-d6019727e0b0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1497.700666] env[67977]: DEBUG nova.network.neutron [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1497.746067] env[67977]: DEBUG nova.network.neutron [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1497.907688] env[67977]: DEBUG nova.network.neutron [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Updating instance_info_cache with network_info: [{"id": "3664995e-eabb-4e28-ac81-bf3c252e91df", "address": "fa:16:3e:f0:dd:b8", "network": {"id": "4eece86d-f584-415a-bcdd-5bde739283be", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1877932954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "52e6b27298274fa1a10d95d9a967814b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "975b168a-03e5-449d-95ac-4d51ba027242", "external-id": "nsx-vlan-transportzone-365", "segmentation_id": 365, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3664995e-ea", "ovs_interfaceid": "3664995e-eabb-4e28-ac81-bf3c252e91df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1497.919096] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Releasing lock "refresh_cache-6fae5126-6618-4337-9a52-d6019727e0b0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1497.919420] env[67977]: DEBUG nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Instance network_info: |[{"id": "3664995e-eabb-4e28-ac81-bf3c252e91df", "address": "fa:16:3e:f0:dd:b8", "network": {"id": "4eece86d-f584-415a-bcdd-5bde739283be", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1877932954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "52e6b27298274fa1a10d95d9a967814b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "975b168a-03e5-449d-95ac-4d51ba027242", "external-id": "nsx-vlan-transportzone-365", "segmentation_id": 365, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3664995e-ea", "ovs_interfaceid": "3664995e-eabb-4e28-ac81-bf3c252e91df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1497.919843] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f0:dd:b8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '975b168a-03e5-449d-95ac-4d51ba027242', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3664995e-eabb-4e28-ac81-bf3c252e91df', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1497.927901] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Creating folder: Project (52e6b27298274fa1a10d95d9a967814b). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1497.928470] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-27a4b9d6-80b0-4635-a6e0-b1dc00d94d51 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1497.941040] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Created folder: Project (52e6b27298274fa1a10d95d9a967814b) in parent group-v693022.
[ 1497.941328] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Creating folder: Instances. Parent ref: group-v693102. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1497.941597] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ba94d178-23a5-42a7-9f10-483c52ea5bfd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1497.951418] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Created folder: Instances in parent group-v693102.
[ 1497.951564] env[67977]: DEBUG oslo.service.loopingcall [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1497.951748] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1497.951960] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8b27010e-4821-4b31-8ea8-610a3c200b88 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.973191] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1497.973191] env[67977]: value = "task-3468234" [ 1497.973191] env[67977]: _type = "Task" [ 1497.973191] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1497.981118] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468234, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1498.148471] env[67977]: DEBUG nova.compute.manager [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Received event network-vif-plugged-3664995e-eabb-4e28-ac81-bf3c252e91df {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1498.148696] env[67977]: DEBUG oslo_concurrency.lockutils [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] Acquiring lock "6fae5126-6618-4337-9a52-d6019727e0b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1498.148909] env[67977]: DEBUG oslo_concurrency.lockutils [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] Lock "6fae5126-6618-4337-9a52-d6019727e0b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1498.149098] env[67977]: DEBUG oslo_concurrency.lockutils [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] Lock "6fae5126-6618-4337-9a52-d6019727e0b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1498.149270] env[67977]: DEBUG nova.compute.manager [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] No waiting events found dispatching network-vif-plugged-3664995e-eabb-4e28-ac81-bf3c252e91df {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1498.149435] env[67977]: WARNING nova.compute.manager [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] 
Received unexpected event network-vif-plugged-3664995e-eabb-4e28-ac81-bf3c252e91df for instance with vm_state building and task_state spawning. [ 1498.149596] env[67977]: DEBUG nova.compute.manager [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Received event network-changed-3664995e-eabb-4e28-ac81-bf3c252e91df {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1498.149746] env[67977]: DEBUG nova.compute.manager [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Refreshing instance network info cache due to event network-changed-3664995e-eabb-4e28-ac81-bf3c252e91df. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1498.149952] env[67977]: DEBUG oslo_concurrency.lockutils [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] Acquiring lock "refresh_cache-6fae5126-6618-4337-9a52-d6019727e0b0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1498.150128] env[67977]: DEBUG oslo_concurrency.lockutils [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] Acquired lock "refresh_cache-6fae5126-6618-4337-9a52-d6019727e0b0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1498.150293] env[67977]: DEBUG nova.network.neutron [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Refreshing network info cache for port 3664995e-eabb-4e28-ac81-bf3c252e91df {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1498.394831] env[67977]: DEBUG nova.network.neutron [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Updated VIF entry in instance network info cache for port 3664995e-eabb-4e28-ac81-bf3c252e91df. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1498.395204] env[67977]: DEBUG nova.network.neutron [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Updating instance_info_cache with network_info: [{"id": "3664995e-eabb-4e28-ac81-bf3c252e91df", "address": "fa:16:3e:f0:dd:b8", "network": {"id": "4eece86d-f584-415a-bcdd-5bde739283be", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1877932954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "52e6b27298274fa1a10d95d9a967814b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "975b168a-03e5-449d-95ac-4d51ba027242", "external-id": "nsx-vlan-transportzone-365", "segmentation_id": 365, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3664995e-ea", "ovs_interfaceid": "3664995e-eabb-4e28-ac81-bf3c252e91df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1498.404554] env[67977]: DEBUG oslo_concurrency.lockutils [req-1d70a32a-e76f-4797-b290-c114e41bad19 req-4da40dac-6599-4ec5-861d-dfe1e050cb67 service nova] Releasing lock "refresh_cache-6fae5126-6618-4337-9a52-d6019727e0b0" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1498.482741] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468234, 'name': CreateVM_Task, 'duration_secs': 0.275444} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1498.485047] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1498.485047] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1498.485047] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1498.485047] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1498.485047] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7d71ffad-08d0-4f8f-83d6-d0b41d391dc3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.488807] env[67977]: DEBUG oslo_vmware.api [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for the task: (returnval){ [ 1498.488807] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52bf2a62-5278-d095-6eee-315cb112d364" [ 1498.488807] env[67977]: _type = "Task" [ 1498.488807] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1498.496712] env[67977]: DEBUG oslo_vmware.api [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52bf2a62-5278-d095-6eee-315cb112d364, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1498.999844] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1499.000191] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1499.000418] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1509.682347] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "d1fc2ae5-fa11-41a7-808b-13da16667078" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1509.682649] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "d1fc2ae5-fa11-41a7-808b-13da16667078" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1514.776052] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1514.776439] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1515.771949] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1515.795513] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1516.774867] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1516.775114] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1516.775288] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1516.775453] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1516.787605] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1516.787828] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1516.788007] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1516.788190] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1516.789312] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a78b8a4b-8228-4250-85dd-64f107bfd2f2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.798334] env[67977]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35a00b19-9470-42e3-bd7e-7ed5063eadaa {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.813076] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40f13d28-b0af-4e27-af05-27897930f932 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.819096] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5983bf6a-3b34-4004-b518-3617e7f6f077 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.847218] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180942MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1516.847354] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1516.847541] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1516.918438] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1516.918603] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1516.918732] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 32d860b3-f438-400f-8296-e62cc662d618 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1516.918855] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1516.918977] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8f1440c5-e712-4635-9f02-f9cda12da693 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1516.919114] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f03fe248-75df-4237-a6dd-cc49012c2331 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1516.919233] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1516.919349] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e77a441b-952b-42c0-907f-e30888e505a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1516.919463] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1516.919575] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6fae5126-6618-4337-9a52-d6019727e0b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1516.930019] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b56ab7a8-cd27-4542-8082-ec023c57e153 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1516.939884] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1516.949094] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1516.958210] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d1fc2ae5-fa11-41a7-808b-13da16667078 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1516.958423] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1516.958570] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1517.107247] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-556da771-d06c-4c0a-bff7-785fc17c247e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1517.114844] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4898cb4-fef3-4d69-aa05-7ce1a4026d2b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1517.144923] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08cdbdc6-6697-4158-bce6-ae80e0410e73 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1517.152827] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52cb1015-0184-4235-a4ad-59a3cacaabe2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1517.171971] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1517.181653] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1517.196188] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1517.196379] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.349s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1518.196584] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1518.196977] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1522.775079] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1522.775344] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1522.775416] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1522.794818] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1522.795025] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1522.795128] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1522.795259] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1522.795390] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1522.795516] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1522.795637] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1522.795758] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1522.795877] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1522.795994] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1522.796125] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1542.363651] env[67977]: WARNING oslo_vmware.rw_handles [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1542.363651] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1542.363651] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1542.363651] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1542.363651] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1542.363651] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1542.363651] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1542.363651] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1542.363651] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1542.363651] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1542.363651] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1542.363651] env[67977]: ERROR oslo_vmware.rw_handles [ 1542.364430] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/98294a6f-4379-44dd-b9a5-4a5655330974/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1542.365918] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1542.366251] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Copying Virtual Disk [datastore1] vmware_temp/98294a6f-4379-44dd-b9a5-4a5655330974/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/98294a6f-4379-44dd-b9a5-4a5655330974/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1542.366538] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4714a210-36cd-42ff-9c79-7d9c7ba64b0e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.375162] env[67977]: DEBUG oslo_vmware.api [None req-4a2ded50-7aed-4cee-ae91-c6352413496f 
tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Waiting for the task: (returnval){ [ 1542.375162] env[67977]: value = "task-3468235" [ 1542.375162] env[67977]: _type = "Task" [ 1542.375162] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1542.382681] env[67977]: DEBUG oslo_vmware.api [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Task: {'id': task-3468235, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1542.885813] env[67977]: DEBUG oslo_vmware.exceptions [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1542.886133] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1542.886698] env[67977]: ERROR nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1542.886698] env[67977]: Faults: ['InvalidArgument'] [ 1542.886698] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Traceback (most recent call last): [ 1542.886698] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1542.886698] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] yield resources [ 1542.886698] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1542.886698] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] self.driver.spawn(context, instance, image_meta, [ 1542.886698] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1542.886698] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1542.886698] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1542.886698] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] self._fetch_image_if_missing(context, vi) [ 1542.886698] env[67977]: ERROR nova.compute.manager 
[instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] image_cache(vi, tmp_image_ds_loc) [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] vm_util.copy_virtual_disk( [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] session._wait_for_task(vmdk_copy_task) [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] return self.wait_for_task(task_ref) [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] return evt.wait() [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] result = hub.switch() [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1542.887035] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] return self.greenlet.switch() [ 1542.887405] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1542.887405] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] self.f(*self.args, **self.kw) [ 1542.887405] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1542.887405] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] raise exceptions.translate_fault(task_info.error) [ 1542.887405] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1542.887405] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Faults: ['InvalidArgument'] [ 1542.887405] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] [ 1542.887405] env[67977]: INFO nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 
tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Terminating instance [ 1542.888665] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1542.888915] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1542.889864] env[67977]: DEBUG nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1542.890079] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1542.890307] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3302933a-3021-465c-87a7-b9a44e27c67c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.892629] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f284899-545c-4652-abf7-094c550cf783 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.899454] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1542.899674] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-928ce8bd-c7d6-4901-bf63-1c99f7ec0364 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.901844] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1542.902025] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1542.902986] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-26f95fae-a440-4323-84a2-2c32763d39fb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.907472] env[67977]: DEBUG oslo_vmware.api [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Waiting for the task: (returnval){ [ 1542.907472] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52dab445-6acd-c2d3-68af-0a8761b142d3" [ 1542.907472] env[67977]: _type = "Task" [ 1542.907472] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1542.919978] env[67977]: DEBUG oslo_vmware.api [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52dab445-6acd-c2d3-68af-0a8761b142d3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1542.971671] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1542.971913] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1542.972113] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Deleting the datastore file [datastore1] eae30b17-eea5-46aa-bb09-91ebca29ea6d {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1542.972399] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7bff5fd8-bdcc-47ef-b4de-81a3859d7342 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.978215] env[67977]: DEBUG oslo_vmware.api [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Waiting for the task: (returnval){ [ 1542.978215] env[67977]: value = "task-3468237" [ 1542.978215] env[67977]: _type = "Task" [ 1542.978215] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1542.985798] env[67977]: DEBUG oslo_vmware.api [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Task: {'id': task-3468237, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1543.417772] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1543.418115] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Creating directory with path [datastore1] vmware_temp/d122d031-754a-4a0c-9bf4-1140ac0c3f54/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1543.418222] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ae133007-23f9-436f-b577-22a22995780d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.430046] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Created directory with path [datastore1] vmware_temp/d122d031-754a-4a0c-9bf4-1140ac0c3f54/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1543.430238] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Fetch image to [datastore1] vmware_temp/d122d031-754a-4a0c-9bf4-1140ac0c3f54/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1543.430407] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/d122d031-754a-4a0c-9bf4-1140ac0c3f54/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1543.431144] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcdcdbab-e5b2-4ef6-864f-e76d625877d7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.437299] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b1aaf6a-243b-421f-bcfb-0e4ea9db7146 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.445967] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-579ec2ab-d04b-4c2a-af3b-b4985853eb7c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.475508] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c28718f-7aa6-4443-a5e9-93fdae00a8e7 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.483320] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-594b0e59-76cc-4dcc-bb48-1258157e23d9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.487380] env[67977]: DEBUG oslo_vmware.api [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Task: {'id': task-3468237, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075758} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1543.487877] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1543.488085] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1543.488267] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1543.488438] env[67977]: INFO nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1543.490432] env[67977]: DEBUG nova.compute.claims [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1543.490597] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1543.490807] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1543.507871] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1543.643704] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1543.645345] env[67977]: ERROR nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 5ac2bac3-6c5c-4005-b6b0-349a1330d017. 
[ 1543.645345] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Traceback (most recent call last): [ 1543.645345] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1543.645345] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1543.645345] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1543.645345] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] result = getattr(controller, method)(*args, **kwargs) [ 1543.645345] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1543.645345] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._get(image_id) [ 1543.645345] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1543.645345] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1543.645345] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] resp, body = self.http_client.get(url, headers=header) [ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.request(url, 'GET', **kwargs) [ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._handle_response(resp) [ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise exc.from_response(resp, resp.content) [ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] During handling of the above exception, another exception occurred: [ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1543.645700] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Traceback (most recent call last): [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] yield resources [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self.driver.spawn(context, instance, image_meta, [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._fetch_image_if_missing(context, vi) [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] image_fetch(context, vi, tmp_image_ds_loc) [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] images.fetch_image( [ 1543.646047] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] metadata = IMAGE_API.get(context, image_ref) [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return session.show(context, image_id, [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] _reraise_translated_image_exception(image_id) [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise new_exc.with_traceback(exc_trace) [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] result = getattr(controller, method)(*args, **kwargs) [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1543.646417] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._get(image_id) [ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] resp, body = self.http_client.get(url, headers=header) [ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.request(url, 'GET', **kwargs) [ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._handle_response(resp) [ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise exc.from_response(resp, resp.content) [ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] nova.exception.ImageNotAuthorized: Not authorized for image 5ac2bac3-6c5c-4005-b6b0-349a1330d017. 
[ 1543.646828] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1543.647107] env[67977]: INFO nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Terminating instance [ 1543.647162] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1543.647484] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1543.650100] env[67977]: DEBUG nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1543.650303] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1543.650546] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3f9762eb-fb1a-4bed-8a07-762a5e35e064 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.652971] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfacbe76-ba6a-4cf0-9f0e-8e6e16c72d86 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.660461] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1543.660701] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-71a1bcbf-68f7-4156-be82-c7818f68a698 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.663781] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1543.663978] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 
tempest-VolumesAdminNegativeTest-966207661-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1543.666924] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b6f81aae-b82b-4436-97e7-495adf031c98 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.672466] env[67977]: DEBUG oslo_vmware.api [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Waiting for the task: (returnval){ [ 1543.672466] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d57e31-fa86-f07c-4501-b0c7d84760a7" [ 1543.672466] env[67977]: _type = "Task" [ 1543.672466] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1543.682284] env[67977]: DEBUG oslo_vmware.api [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d57e31-fa86-f07c-4501-b0c7d84760a7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1543.718057] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34522386-460f-4738-a03c-9c42f867d103 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.721688] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1543.721881] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1543.722069] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Deleting the datastore file [datastore1] 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1543.722652] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e7373643-69ad-48e5-8913-c62874daa6ed {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.727273] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcb4c6d5-7031-4f55-8c31-e0b995b2126b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.731045] env[67977]: DEBUG oslo_vmware.api [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 
tempest-MigrationsAdminTest-956232985-project-member] Waiting for the task: (returnval){ [ 1543.731045] env[67977]: value = "task-3468239" [ 1543.731045] env[67977]: _type = "Task" [ 1543.731045] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1543.760786] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-977315df-9298-48fe-9f19-73e799e3b0e4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.765878] env[67977]: DEBUG oslo_vmware.api [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Task: {'id': task-3468239, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1543.770372] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a24474e-7a5a-47f9-9fa0-ccdcba9b969b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.783375] env[67977]: DEBUG nova.compute.provider_tree [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1543.791633] env[67977]: DEBUG nova.scheduler.client.report [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1543.805743] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.315s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1543.806308] env[67977]: ERROR nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1543.806308] env[67977]: Faults: ['InvalidArgument'] [ 1543.806308] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Traceback (most recent call last): [ 1543.806308] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] 
File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1543.806308] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] self.driver.spawn(context, instance, image_meta, [ 1543.806308] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1543.806308] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1543.806308] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1543.806308] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] self._fetch_image_if_missing(context, vi) [ 1543.806308] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1543.806308] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] image_cache(vi, tmp_image_ds_loc) [ 1543.806308] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] vm_util.copy_virtual_disk( [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] session._wait_for_task(vmdk_copy_task) [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] return self.wait_for_task(task_ref) [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] return evt.wait() [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] result = hub.switch() [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] return self.greenlet.switch() [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1543.806661] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] self.f(*self.args, **self.kw) [ 1543.807030] 
env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1543.807030] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] raise exceptions.translate_fault(task_info.error) [ 1543.807030] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1543.807030] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Faults: ['InvalidArgument'] [ 1543.807030] env[67977]: ERROR nova.compute.manager [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] [ 1543.807166] env[67977]: DEBUG nova.compute.utils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1543.809034] env[67977]: DEBUG nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Build of instance eae30b17-eea5-46aa-bb09-91ebca29ea6d was re-scheduled: A specified parameter was not correct: fileType [ 1543.809034] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1543.809479] env[67977]: DEBUG nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1543.809740] env[67977]: DEBUG nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1543.809781] env[67977]: DEBUG nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1543.809939] env[67977]: DEBUG nova.network.neutron [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1544.185192] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1544.185417] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Creating directory with path [datastore1] vmware_temp/008aec59-38d4-47ce-9d46-0737430523dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1544.185638] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bf1b77fa-3a54-4b8f-8141-d1ee2ded6aa2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.196792] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Created directory with path [datastore1] vmware_temp/008aec59-38d4-47ce-9d46-0737430523dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1544.197034] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Fetch image to [datastore1] vmware_temp/008aec59-38d4-47ce-9d46-0737430523dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1544.197226] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/008aec59-38d4-47ce-9d46-0737430523dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1544.198034] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-455e67d2-371d-45c0-bc24-1b87e4ce6e28 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.204909] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18a7b187-2c95-4ec6-bbd2-a3b1c84e0182 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.214505] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fc43ccf-27b4-4c49-a16d-4413e524a29e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.259079] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6db75f2b-8d65-4225-8bed-2fd356b406df {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.268641] env[67977]: DEBUG oslo_vmware.api [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Task: {'id': task-3468239, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074267} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1544.271428] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1544.271641] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1544.271815] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1544.272016] env[67977]: INFO nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 1544.273705] env[67977]: DEBUG nova.network.neutron [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1544.275208] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2ca002ce-88dd-4e2c-b3d0-498dc19dd209 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.277495] env[67977]: DEBUG nova.compute.claims [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1544.277692] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1544.278073] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1544.287597] env[67977]: INFO nova.compute.manager [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Took 0.48 seconds to deallocate network for instance. 
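The compute_resources lock records above (Acquiring/acquired with waited time, released with held time) come from oslo.concurrency's lockutils wrapper, which the resource tracker uses to serialize claims and aborts on a host. A minimal sketch of the pattern; the class and empty method body here are illustrative, not the real ResourceTracker:

```python
from oslo_concurrency import lockutils


class ResourceTracker:
    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(self, instance):
        # Body runs with the process-local 'compute_resources' semaphore
        # held; lockutils' inner() wrapper emits the Acquiring/acquired/
        # released DEBUG lines, including how long the caller waited and
        # how long the lock was held.
        pass


ResourceTracker().abort_instance_claim(None)
```

The waited/held figures are the useful signal here: the 0.315s hold on abort_instance_claim above means any concurrent claim on this node queued behind it for that long.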
[ 1544.299732] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1544.378207] env[67977]: INFO nova.scheduler.client.report [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Deleted allocations for instance eae30b17-eea5-46aa-bb09-91ebca29ea6d [ 1544.402830] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4a2ded50-7aed-4cee-ae91-c6352413496f tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 675.224s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1544.404095] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 478.474s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1544.404343] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Acquiring lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1544.404548] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1544.404714] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1544.406566] env[67977]: INFO nova.compute.manager [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Terminating instance [ 1544.408161] env[67977]: DEBUG nova.compute.manager [None 
req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1544.408940] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1544.408940] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-22ace537-fbee-4389-9c99-f9c7eb246365 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.420669] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45feb5ad-e6c7-49c5-ac63-dc275811c38b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.437147] env[67977]: DEBUG nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1544.458969] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance eae30b17-eea5-46aa-bb09-91ebca29ea6d could not be found. [ 1544.459190] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1544.459365] env[67977]: INFO nova.compute.manager [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1544.459605] env[67977]: DEBUG oslo.service.loopingcall [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1544.463575] env[67977]: DEBUG nova.compute.manager [-] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1544.463702] env[67977]: DEBUG nova.network.neutron [-] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1544.480944] env[67977]: DEBUG oslo_vmware.rw_handles [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/008aec59-38d4-47ce-9d46-0737430523dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1544.538711] env[67977]: DEBUG nova.network.neutron [-] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1544.540337] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1544.543147] env[67977]: DEBUG oslo_vmware.rw_handles [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1544.543147] env[67977]: DEBUG oslo_vmware.rw_handles [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/008aec59-38d4-47ce-9d46-0737430523dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1544.548255] env[67977]: INFO nova.compute.manager [-] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] Took 0.08 seconds to deallocate network for instance. 
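The rw_handles records above outline the image upload: a write handle opens an HTTPS connection to the host's /folder datastore endpoint announcing the exact file size (21318656 bytes), the Glance image iterator is drained into it chunk by chunk, and the handle is closed once the iterator is exhausted. A rough sketch of that flow with http.client; host, path, and the chunk iterator are placeholders, and the real implementation lives in oslo_vmware.rw_handles with its own connection setup and error handling:

```python
import http.client
import ssl


def upload_to_datastore(host, path, chunks, size):
    """Stream image bytes to an ESX /folder URL with a fixed length.

    host, path and chunks are placeholders; size is the total byte
    count announced up front (21318656 in the log above).
    """
    conn = http.client.HTTPSConnection(
        host, 443, context=ssl.create_default_context())
    conn.putrequest('PUT', path)
    conn.putheader('Content-Length', str(size))
    conn.putheader('Content-Type', 'binary/octet-stream')
    conn.endheaders()
    for chunk in chunks:       # "reading data from the image iterator"
        conn.send(chunk)
    resp = conn.getresponse()  # closing the handle validates the status
    resp.read()
    conn.close()
    if resp.status not in (200, 201):
        raise IOError('upload failed: %d %s' % (resp.status, resp.reason))
```

Announcing the size up front mirrors the log, which records the byte count before the transfer begins.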
[ 1544.600830] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1574645a-323a-48b7-99b9-6a448934fdea {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.608766] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5f10ac3-7d22-46f8-8401-7bd6a2f996cd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.640839] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19779502-2032-4950-ab33-d9ac6520cf2a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.643636] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f3440270-df83-4170-8cd2-fad5dd94372d tempest-ImagesOneServerNegativeTestJSON-1861799384 tempest-ImagesOneServerNegativeTestJSON-1861799384-project-member] Lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.240s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1544.644695] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 245.919s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1544.644884] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: eae30b17-eea5-46aa-bb09-91ebca29ea6d] During sync_power_state the instance has a pending task (deleting). Skip. 
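The _sync_power_states record above shows the periodic power-state audit yielding to an in-flight operation: it takes the per-instance lock, sees a pending task_state (deleting), and skips the instance rather than racing the delete path. The guard reduces to roughly the following check, where instance is a hypothetical object with a task_state attribute:

```python
def query_driver_power_state_and_sync(instance):
    """Periodic power-state sync for one instance (simplified)."""
    # Skip instances with a task in flight: the running task (here a
    # delete) owns the instance's state and will update it on completion.
    if instance.task_state is not None:
        # Logged above as: "During sync_power_state the instance has
        # a pending task (deleting). Skip."
        return
    # ...otherwise compare the driver's power state with the database
    # record and reconcile any drift...
```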
[ 1544.645110] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "eae30b17-eea5-46aa-bb09-91ebca29ea6d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1544.650208] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a73313c-e822-4b5a-b48c-680f59fb0d3c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.664169] env[67977]: DEBUG nova.compute.provider_tree [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1544.678709] env[67977]: DEBUG nova.scheduler.client.report [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1544.691098] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.413s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1544.691843] env[67977]: ERROR nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 5ac2bac3-6c5c-4005-b6b0-349a1330d017. 
[ 1544.691843] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Traceback (most recent call last): [ 1544.691843] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1544.691843] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1544.691843] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1544.691843] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] result = getattr(controller, method)(*args, **kwargs) [ 1544.691843] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1544.691843] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._get(image_id) [ 1544.691843] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1544.691843] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1544.691843] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] resp, body = self.http_client.get(url, headers=header) [ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.request(url, 'GET', **kwargs) [ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._handle_response(resp) [ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise exc.from_response(resp, resp.content) [ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] During handling of the above exception, another exception occurred: [ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.692198] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Traceback (most recent call last): [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self.driver.spawn(context, instance, image_meta, [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._fetch_image_if_missing(context, vi) [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] image_fetch(context, vi, tmp_image_ds_loc) [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] images.fetch_image( [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] metadata = IMAGE_API.get(context, image_ref) [ 1544.692528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return session.show(context, image_id, [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] _reraise_translated_image_exception(image_id) [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise new_exc.with_traceback(exc_trace) [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 
665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] result = getattr(controller, method)(*args, **kwargs) [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._get(image_id) [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1544.692874] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1544.693265] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1544.693265] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] resp, body = self.http_client.get(url, headers=header) [ 1544.693265] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1544.693265] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.request(url, 'GET', **kwargs) [ 1544.693265] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1544.693265] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._handle_response(resp) [ 1544.693265] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1544.693265] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise exc.from_response(resp, resp.content) [ 1544.693265] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] nova.exception.ImageNotAuthorized: Not authorized for image 5ac2bac3-6c5c-4005-b6b0-349a1330d017. [ 1544.693265] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.693265] env[67977]: DEBUG nova.compute.utils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Not authorized for image 5ac2bac3-6c5c-4005-b6b0-349a1330d017. 
{{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1544.693572] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.153s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1544.694985] env[67977]: INFO nova.compute.claims [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1544.697761] env[67977]: DEBUG nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Build of instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec was re-scheduled: Not authorized for image 5ac2bac3-6c5c-4005-b6b0-349a1330d017. {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1544.698247] env[67977]: DEBUG nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1544.698424] env[67977]: DEBUG nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1544.698584] env[67977]: DEBUG nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1544.698749] env[67977]: DEBUG nova.network.neutron [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1544.823199] env[67977]: DEBUG neutronclient.v2_0.client [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67977) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1544.825420] env[67977]: ERROR nova.compute.manager [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1544.825420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Traceback (most recent call last): [ 1544.825420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1544.825420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1544.825420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1544.825420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] result = getattr(controller, method)(*args, **kwargs) [ 1544.825420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1544.825420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._get(image_id) [ 1544.825420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1544.825420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1544.825420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] resp, body = self.http_client.get(url, headers=header) [ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.request(url, 'GET', **kwargs) [ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._handle_response(resp) [ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise exc.from_response(resp, resp.content) [ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
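The "During handling of the above exception, another exception occurred:" markers threaded through these tracebacks are Python's implicit exception chaining: raising inside an except block stores the active exception on __context__, and the interpreter prints both tracebacks. A minimal, runnable sketch of the effect (the class and message are illustrative stand-ins, not nova code):

class ImageNotAuthorized(Exception):
    pass

def show_image():
    try:
        # Stand-in for glanceclient raising HTTPUnauthorized on a 401 response.
        raise PermissionError("HTTP 401 Unauthorized")
    except PermissionError:
        # Raising a new exception here implicitly chains the PermissionError
        # as __context__, which produces the "During handling of the above
        # exception, another exception occurred:" section when printed.
        raise ImageNotAuthorized("Not authorized for image")

show_image()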
[ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] During handling of the above exception, another exception occurred: [ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.825753] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Traceback (most recent call last): [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self.driver.spawn(context, instance, image_meta, [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._fetch_image_if_missing(context, vi) [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] image_fetch(context, vi, tmp_image_ds_loc) [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] images.fetch_image( [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] metadata = IMAGE_API.get(context, image_ref) [ 1544.826088] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return session.show(context, image_id, [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] _reraise_translated_image_exception(image_id) [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise new_exc.with_traceback(exc_trace) [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 
665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] result = getattr(controller, method)(*args, **kwargs) [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._get(image_id) [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1544.826420] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] resp, body = self.http_client.get(url, headers=header) [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.request(url, 'GET', **kwargs) [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self._handle_response(resp) [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise exc.from_response(resp, resp.content) [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] nova.exception.ImageNotAuthorized: Not authorized for image 5ac2bac3-6c5c-4005-b6b0-349a1330d017. 
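The chain above bottoms out in nova/image/glance.py: _reraise_translated_image_exception() converts glanceclient's HTTPUnauthorized into nova's ImageNotAuthorized while re-attaching the original traceback, which is why both exception names appear over the same glanceclient frames. A rough sketch of that translation pattern, with stand-in exception classes (the real mapping in glance.py covers more client error types):

import sys

class HTTPUnauthorized(Exception):
    """Stand-in for glanceclient.exc.HTTPUnauthorized."""

class ImageNotAuthorized(Exception):
    """Stand-in for nova.exception.ImageNotAuthorized."""

def _translate_image_exception(image_id, exc_value):
    # Map a client-level error onto the API-level exception.
    if isinstance(exc_value, HTTPUnauthorized):
        return ImageNotAuthorized("Not authorized for image %s." % image_id)
    return exc_value

def _reraise_translated_image_exception(image_id):
    # Re-raise the translated exception with the original traceback attached,
    # matching the "raise new_exc.with_traceback(exc_trace)" frame in the log.
    exc_type, exc_value, exc_trace = sys.exc_info()
    new_exc = _translate_image_exception(image_id, exc_value)
    raise new_exc.with_traceback(exc_trace)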
[ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] During handling of the above exception, another exception occurred: [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.826766] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Traceback (most recent call last): [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._build_and_run_instance(context, instance, image, [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise exception.RescheduledException( [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] nova.exception.RescheduledException: Build of instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec was re-scheduled: Not authorized for image 5ac2bac3-6c5c-4005-b6b0-349a1330d017. [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] During handling of the above exception, another exception occurred: [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Traceback (most recent call last): [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1544.827120] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] exception_handler_v20(status_code, error_body) [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise client_exc(message=error_message, [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Neutron server returns request_ids: ['req-7673a570-3af4-4dfd-b341-7ee211d5c0c9'] [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 
665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] During handling of the above exception, another exception occurred: [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Traceback (most recent call last): [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._deallocate_network(context, instance, requested_networks) [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self.network_api.deallocate_for_instance( [ 1544.827823] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] data = neutron.list_ports(**search_opts) [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.list('ports', self.ports_path, retrieve_all, [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] for r in self._pagination(collection, path, **params): [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] res = self.get(path, params=params) [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1544.828177] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 
665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.retry_request("GET", action, body=body, [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.do_request(method, action, body=body, [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._handle_fault_response(status_code, replybody, resp) [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise exception.Unauthorized() [ 1544.828565] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] nova.exception.Unauthorized: Not authorized. 
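This second chain ends in the wrapper decorator from nova/network/neutron.py (the recurring "line 196, in wrapper" frames): every neutronclient call goes through it, and a 401 from Neutron is translated into a nova-level exception, nova.exception.Unauthorized (line 204) when the request was made with the caller's own token, or, as in the looping-call failure further below, NeutronAdminCredentialConfigurationInvalid (line 212) when the admin credentials from nova.conf were rejected. A hedged sketch of that mapping with stand-in classes (nova derives the token type from the client session rather than from a flag):

class NeutronUnauthorized(Exception):
    """Stand-in for neutronclient.common.exceptions.Unauthorized."""

class Unauthorized(Exception):
    """Stand-in for nova.exception.Unauthorized."""

class NeutronAdminCredentialConfigurationInvalid(Exception):
    """Stand-in for nova.exception.NeutronAdminCredentialConfigurationInvalid."""

def call_neutron(op, used_admin_token=False):
    # Invoke a neutron client operation, translating a 401 the way the
    # wrapper does; 'op' is any zero-argument callable issuing the request.
    try:
        return op()
    except NeutronUnauthorized:
        if used_admin_token:
            # Admin token rejected: the [neutron] credentials in nova.conf are bad.
            raise NeutronAdminCredentialConfigurationInvalid()
        # User token rejected: the caller is simply not authorized.
        raise Unauthorized()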
[ 1544.828868] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1544.878392] env[67977]: INFO nova.scheduler.client.report [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Deleted allocations for instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec [ 1544.895806] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3488201-f433-453c-900f-b58bb695caf2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.899585] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9ecde8c7-84d9-43f3-a344-f5775e184e54 tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 600.629s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1544.901557] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 403.924s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1544.901837] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Acquiring lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1544.902097] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1544.902285] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1544.904026] env[67977]: INFO nova.compute.manager [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Terminating instance [ 1544.908521] env[67977]: DEBUG nova.compute.manager [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Start destroying the instance on the hypervisor.
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1544.908729] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1544.909728] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b190bc8c-b053-41f5-bee3-3b9960fd8604 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.912846] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-42527a38-9af0-4f15-b342-44cadb9a1b45 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.943569] env[67977]: DEBUG nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1544.946657] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0720a1e-53b8-4205-a97d-6462436066dc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.952494] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d81b349-a0b9-4495-9a25-83c16b10ce46 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.967511] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b70e45c9-9e3c-4547-90e8-16d9d5f8a779 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.981175] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec could not be found. [ 1544.981397] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1544.981565] env[67977]: INFO nova.compute.manager [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Took 0.07 seconds to destroy the instance on the hypervisor. [ 1544.981813] env[67977]: DEBUG oslo.service.loopingcall [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1544.993172] env[67977]: DEBUG nova.compute.manager [-] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1544.993220] env[67977]: DEBUG nova.network.neutron [-] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1544.995278] env[67977]: DEBUG nova.compute.provider_tree [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1545.005026] env[67977]: DEBUG nova.scheduler.client.report [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1545.011990] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1545.019930] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1545.020440] env[67977]: DEBUG nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Start building networks asynchronously for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1545.023417] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.011s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1545.024982] env[67977]: INFO nova.compute.claims [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1545.063512] env[67977]: DEBUG nova.compute.utils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1545.065677] env[67977]: DEBUG nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1545.065677] env[67977]: DEBUG nova.network.neutron [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1545.073609] env[67977]: DEBUG nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1545.103793] env[67977]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67977) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1545.104206] env[67977]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
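The looping-call failure above comes from the retry wrapper around network deallocation: _try_deallocate_network runs _deallocate_network_with_retries under oslo_service's RetryDecorator, which re-invokes the function on the exceptions it is told to retry; the credential-configuration error is not retried away, so it propagates, as the traceback below shows. A generic sketch of that retry shape (attempt counts and sleeps here are illustrative, not nova's settings):

import functools
import time

def retry_on(exc_types, max_attempts=3, delay_s=1.0):
    # Re-invoke the wrapped function while it raises one of exc_types;
    # any other exception propagates immediately on the first attempt.
    def decorator(func):
        @functools.wraps(func)
        def _func(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except exc_types:
                    if attempt == max_attempts:
                        raise
                    time.sleep(delay_s)
        return _func
    return decorator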
[ 1545.105371] env[67977]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-e9bb5fcb-5b18-40f9-94e8-07a6a2882529'] [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1545.105371] env[67977]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1545.105820] env[67977]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1545.105820] env[67977]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1545.106267] env[67977]: ERROR oslo.service.loopingcall [ 1545.106659] env[67977]: ERROR nova.compute.manager [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1545.140485] env[67977]: ERROR nova.compute.manager [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1545.140485] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Traceback (most recent call last): [ 1545.140485] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.140485] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1545.140485] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1545.140485] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] exception_handler_v20(status_code, error_body) [ 1545.140485] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1545.140485] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise client_exc(message=error_message, [ 1545.140485] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1545.140485] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Neutron server returns request_ids: ['req-e9bb5fcb-5b18-40f9-94e8-07a6a2882529'] [ 1545.140485] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] During handling of the above exception, another exception occurred: [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Traceback (most recent call last): [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._delete_instance(context, instance, bdms) [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._shutdown_instance(context, instance, bdms) [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._try_deallocate_network(context, instance, requested_networks) [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] with excutils.save_and_reraise_exception(): [ 1545.140866] env[67977]: ERROR 
nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1545.140866] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self.force_reraise() [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise self.value [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] _deallocate_network_with_retries() [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return evt.wait() [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] result = hub.switch() [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.greenlet.switch() [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1545.141235] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] result = func(*self.args, **self.kw) [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] result = f(*args, **kwargs) [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._deallocate_network( [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self.network_api.deallocate_for_instance( [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 
665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] data = neutron.list_ports(**search_opts) [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.list('ports', self.ports_path, retrieve_all, [ 1545.141528] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] for r in self._pagination(collection, path, **params): [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] res = self.get(path, params=params) [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.retry_request("GET", action, body=body, [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1545.141876] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] return self.do_request(method, action, body=body, [ 1545.142246] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.142246] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] ret = obj(*args, **kwargs) [ 1545.142246] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1545.142246] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] self._handle_fault_response(status_code, replybody, resp) [ 1545.142246] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1545.142246] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1545.142246] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1545.142246] env[67977]: ERROR nova.compute.manager [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] [ 1545.144396] env[67977]: DEBUG nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1545.161161] env[67977]: DEBUG nova.policy [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '766cde5830814a7396549aa7288a0aed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bf3e35c829af479dbea74ebb00553ca4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1545.174277] env[67977]: DEBUG nova.virt.hardware [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1545.174967] env[67977]: DEBUG nova.virt.hardware [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1545.174967] env[67977]: DEBUG nova.virt.hardware [None 
req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1545.174967] env[67977]: DEBUG nova.virt.hardware [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1545.175151] env[67977]: DEBUG nova.virt.hardware [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1545.175188] env[67977]: DEBUG nova.virt.hardware [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1545.175430] env[67977]: DEBUG nova.virt.hardware [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1545.175597] env[67977]: DEBUG nova.virt.hardware [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1545.175767] env[67977]: DEBUG nova.virt.hardware [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1545.175946] env[67977]: DEBUG nova.virt.hardware [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1545.176113] env[67977]: DEBUG nova.virt.hardware [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1545.176989] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f1ad08a-a462-484e-8e6e-5f040b1b4b91 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.183617] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" 
"released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.282s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1545.185778] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 246.459s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1545.185778] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] During sync_power_state the instance has a pending task (deleting). Skip. [ 1545.185778] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1545.191025] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-227d753f-f9fd-4848-a02a-c15da6fd8410 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.245947] env[67977]: INFO nova.compute.manager [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] [instance: 665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec] Successfully reverted task state from None on failure for instance. [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server [None req-6dde271f-e34a-4f83-922a-0ec85d8efaff tempest-MigrationsAdminTest-956232985 tempest-MigrationsAdminTest-956232985-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-e9bb5fcb-5b18-40f9-94e8-07a6a2882529'] [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1545.250691] env[67977]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server raise self.value [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1545.251124] env[67977]: ERROR oslo_messaging.rpc.server raise self.value [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server raise self.value [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1545.251589] env[67977]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server raise self.value [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server raise self.value [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1545.252083] env[67977]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1545.252519] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.252519] env[67977]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1545.253038] env[67977]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1545.253512] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1545.253512] env[67977]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1545.253512] env[67977]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1545.253512] env[67977]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1545.253512] env[67977]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
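The traceback above is nova's standard translation of a neutron 401: every python-neutronclient call is routed through the wrapper function in /opt/stack/nova/nova/network/neutron.py (line 196 in this tree), and neutronclient.common.exceptions.Unauthorized is re-raised at line 212 as NeutronAdminCredentialConfigurationInvalid, because an unauthorized response on nova's own service credentials points at the [neutron] auth settings in nova.conf rather than at the end user's token. A minimal, self-contained sketch of that pattern, with stand-in exception classes and a hypothetical decorator name (not the actual nova source):

    import functools

    class Unauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized (HTTP 401)."""

    class NeutronAdminCredentialConfigurationInvalid(Exception):
        """Stand-in for the nova.exception class raised at neutron.py:212."""

    def translate_neutron_errors(func):
        # Behaviour mirrors what the traceback documents at neutron.py:196/212.
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)   # "ret = obj(*args, **kwargs)" in the log
            except Unauthorized:
                # A 401 while using service credentials means nova's neutron
                # auth is misconfigured, so surface a configuration error.
                raise NeutronAdminCredentialConfigurationInvalid()
        return wrapper

    @translate_neutron_errors
    def list_ports(**search_opts):
        raise Unauthorized()   # simulate the 401 neutron returned above

    try:
        list_ports(device_id='665e9acb-c48d-42c3-8cbc-bd2fe8f7b5ec')
    except NeutronAdminCredentialConfigurationInvalid:
        print('401 translated into a nova-side configuration error')

The practical consequence is that nothing the tenant retries can fix this path; the service-side credentials nova uses to call neutron (the [neutron] section of nova.conf) have to be corrected.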
[ 1545.253512] env[67977]: ERROR oslo_messaging.rpc.server [ 1545.263662] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6f59277-e0c4-496e-87a6-6cce83b1d2fb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.271471] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10bea987-fcdf-4b96-84c0-56704737f8e0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.304879] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4288546-9b22-48ed-95eb-3117e2068a59 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.312170] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acade9d7-d353-4b5f-8e23-e4f8f1a86a5c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.325515] env[67977]: DEBUG nova.compute.provider_tree [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1545.334368] env[67977]: DEBUG nova.scheduler.client.report [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1545.351770] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.329s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1545.352133] env[67977]: DEBUG nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Start building networks asynchronously for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1545.402138] env[67977]: DEBUG nova.compute.utils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1545.403492] env[67977]: DEBUG nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1545.403673] env[67977]: DEBUG nova.network.neutron [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1545.416798] env[67977]: DEBUG nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1545.458705] env[67977]: DEBUG nova.policy [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4965d451810c48458246493019d83172', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d528c04bd83409eb74e20393651c040', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1545.503959] env[67977]: DEBUG nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1545.528657] env[67977]: DEBUG nova.virt.hardware [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1545.528896] env[67977]: DEBUG nova.virt.hardware [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1545.529104] env[67977]: DEBUG nova.virt.hardware [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1545.529308] env[67977]: DEBUG nova.virt.hardware [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1545.529454] env[67977]: DEBUG nova.virt.hardware [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1545.529603] env[67977]: DEBUG nova.virt.hardware [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1545.529814] env[67977]: DEBUG nova.virt.hardware [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1545.529976] env[67977]: DEBUG nova.virt.hardware [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1545.530166] env[67977]: DEBUG nova.virt.hardware [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 
tempest-ServersTestJSON-1986579007-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1545.530332] env[67977]: DEBUG nova.virt.hardware [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1545.530508] env[67977]: DEBUG nova.virt.hardware [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1545.531370] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca7540a3-e6e4-4091-9ba4-d1b46bbc69c6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.539773] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85b0dd6e-37a0-47fb-8411-b531fc9733d6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.605809] env[67977]: DEBUG nova.network.neutron [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Successfully created port: b13751bf-7207-45d2-b0a1-d8608013462d {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1545.752604] env[67977]: DEBUG nova.network.neutron [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Successfully created port: 43e349c2-b417-44e1-89a9-5b15a09faade {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1546.166017] env[67977]: DEBUG nova.compute.manager [req-130fe43f-40f5-4262-bf2c-81dc75359aa9 req-dd30764a-15ee-41ad-9546-6877d163a831 service nova] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Received event network-vif-plugged-b13751bf-7207-45d2-b0a1-d8608013462d {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1546.166017] env[67977]: DEBUG oslo_concurrency.lockutils [req-130fe43f-40f5-4262-bf2c-81dc75359aa9 req-dd30764a-15ee-41ad-9546-6877d163a831 service nova] Acquiring lock "b56ab7a8-cd27-4542-8082-ec023c57e153-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1546.166017] env[67977]: DEBUG oslo_concurrency.lockutils [req-130fe43f-40f5-4262-bf2c-81dc75359aa9 req-dd30764a-15ee-41ad-9546-6877d163a831 service nova] Lock "b56ab7a8-cd27-4542-8082-ec023c57e153-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1546.166017] env[67977]: DEBUG oslo_concurrency.lockutils [req-130fe43f-40f5-4262-bf2c-81dc75359aa9 req-dd30764a-15ee-41ad-9546-6877d163a831 service nova] Lock "b56ab7a8-cd27-4542-8082-ec023c57e153-events"
"released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1546.166176] env[67977]: DEBUG nova.compute.manager [req-130fe43f-40f5-4262-bf2c-81dc75359aa9 req-dd30764a-15ee-41ad-9546-6877d163a831 service nova] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] No waiting events found dispatching network-vif-plugged-b13751bf-7207-45d2-b0a1-d8608013462d {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1546.166339] env[67977]: WARNING nova.compute.manager [req-130fe43f-40f5-4262-bf2c-81dc75359aa9 req-dd30764a-15ee-41ad-9546-6877d163a831 service nova] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Received unexpected event network-vif-plugged-b13751bf-7207-45d2-b0a1-d8608013462d for instance with vm_state building and task_state spawning. [ 1546.220634] env[67977]: DEBUG nova.network.neutron [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Successfully updated port: b13751bf-7207-45d2-b0a1-d8608013462d {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1546.230746] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "refresh_cache-b56ab7a8-cd27-4542-8082-ec023c57e153" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1546.231656] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquired lock "refresh_cache-b56ab7a8-cd27-4542-8082-ec023c57e153" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1546.233575] env[67977]: DEBUG nova.network.neutron [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1546.274152] env[67977]: DEBUG nova.network.neutron [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1546.345776] env[67977]: DEBUG nova.compute.manager [req-edc744bf-256e-468a-b264-11696411563c req-6140c879-145e-427a-9b27-b8d0374ac764 service nova] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Received event network-vif-plugged-43e349c2-b417-44e1-89a9-5b15a09faade {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1546.346030] env[67977]: DEBUG oslo_concurrency.lockutils [req-edc744bf-256e-468a-b264-11696411563c req-6140c879-145e-427a-9b27-b8d0374ac764 service nova] Acquiring lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1546.346230] env[67977]: DEBUG oslo_concurrency.lockutils [req-edc744bf-256e-468a-b264-11696411563c req-6140c879-145e-427a-9b27-b8d0374ac764 service nova] Lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1546.346399] env[67977]: DEBUG oslo_concurrency.lockutils [req-edc744bf-256e-468a-b264-11696411563c req-6140c879-145e-427a-9b27-b8d0374ac764 service nova] Lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1546.346561] env[67977]: DEBUG nova.compute.manager [req-edc744bf-256e-468a-b264-11696411563c req-6140c879-145e-427a-9b27-b8d0374ac764 service nova] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] No waiting events found dispatching network-vif-plugged-43e349c2-b417-44e1-89a9-5b15a09faade {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1546.346721] env[67977]: WARNING nova.compute.manager [req-edc744bf-256e-468a-b264-11696411563c req-6140c879-145e-427a-9b27-b8d0374ac764 service nova] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Received unexpected event network-vif-plugged-43e349c2-b417-44e1-89a9-5b15a09faade for instance with vm_state building and task_state spawning.
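Both instances hit the same benign ordering race here: neutron emits network-vif-plugged before the compute thread has registered a waiter for it, so pop_instance_event finds nothing and the event is logged as unexpected while the build simply continues. A rough, self-contained model of that dispatch, with illustrative names (nova's real registry sits behind the per-instance "-events" lock shown above):

    import threading

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()      # plays the "<uuid>-events" lock
            self._waiters = {}                 # (uuid, event_name) -> Event

        def register(self, uuid, event_name):
            ev = threading.Event()
            with self._lock:
                self._waiters[(uuid, event_name)] = ev
            return ev

        def pop_instance_event(self, uuid, event_name):
            with self._lock:
                return self._waiters.pop((uuid, event_name), None)

    def external_instance_event(events, uuid, event_name):
        waiter = events.pop_instance_event(uuid, event_name)
        if waiter is None:
            # Matches the WARNING lines: nothing was waiting yet, so the
            # notification is dropped and spawning carries on regardless.
            print(f'unexpected event {event_name} for {uuid}')
        else:
            waiter.set()                       # wake the thread blocked on it

    events = InstanceEvents()
    external_instance_event(events, 'b56ab7a8-cd27-4542-8082-ec023c57e153',
                            'network-vif-plugged')

When the compute thread registers before the event arrives, the same dispatch wakes it instead; the warning path is only taken for this early-arrival ordering.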
[ 1546.465124] env[67977]: DEBUG nova.network.neutron [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Successfully updated port: 43e349c2-b417-44e1-89a9-5b15a09faade {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1546.478048] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "refresh_cache-ac4fe863-2435-48ed-9c7c-9e7144be8e70" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1546.478201] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquired lock "refresh_cache-ac4fe863-2435-48ed-9c7c-9e7144be8e70" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1546.478355] env[67977]: DEBUG nova.network.neutron [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1546.513417] env[67977]: DEBUG nova.network.neutron [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Updating instance_info_cache with network_info: [{"id": "b13751bf-7207-45d2-b0a1-d8608013462d", "address": "fa:16:3e:e6:f6:94", "network": {"id": "1164109a-0805-4699-b73b-2f458affef73", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-646858955-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf3e35c829af479dbea74ebb00553ca4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f8442aa5-73db-4599-8564-b98a6ea26b9b", "external-id": "nsx-vlan-transportzone-893", "segmentation_id": 893, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb13751bf-72", "ovs_interfaceid": "b13751bf-7207-45d2-b0a1-d8608013462d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1546.527335] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Releasing lock "refresh_cache-b56ab7a8-cd27-4542-8082-ec023c57e153" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1546.527652] env[67977]: DEBUG nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 
tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Instance network_info: |[{"id": "b13751bf-7207-45d2-b0a1-d8608013462d", "address": "fa:16:3e:e6:f6:94", "network": {"id": "1164109a-0805-4699-b73b-2f458affef73", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-646858955-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf3e35c829af479dbea74ebb00553ca4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f8442aa5-73db-4599-8564-b98a6ea26b9b", "external-id": "nsx-vlan-transportzone-893", "segmentation_id": 893, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb13751bf-72", "ovs_interfaceid": "b13751bf-7207-45d2-b0a1-d8608013462d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1546.528048] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e6:f6:94', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f8442aa5-73db-4599-8564-b98a6ea26b9b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b13751bf-7207-45d2-b0a1-d8608013462d', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1546.540955] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Creating folder: Project (bf3e35c829af479dbea74ebb00553ca4). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1546.541934] env[67977]: DEBUG nova.network.neutron [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1546.544050] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dfcca808-4b8d-4d12-a737-c0b0e0bf9d7f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.554979] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Created folder: Project (bf3e35c829af479dbea74ebb00553ca4) in parent group-v693022. 
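The folder creation above and the CreateVM_Task lines that follow are oslo.vmware's invoke-then-wait pattern: a SOAP method is invoked through the session, returns a Task managed-object reference immediately, and wait_for_task polls it until vCenter reports success (the "progress is 0%" then "completed successfully" entries below). A hedged sketch; the endpoint, credentials, and the folder/pool/config references are placeholders rather than values from this deployment:

    from oslo_vmware import api

    # Placeholder endpoint and credentials; constructing the session logs in
    # to vCenter, so this only actually runs against a real server.
    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',   # host, username, password
        api_retry_count=10, task_poll_interval=0.5)

    folder_ref = respool_ref = config_spec = None   # stand-in morefs/spec

    # Folder.CreateVM_Task returns a Task moref without blocking ...
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=respool_ref)
    # ... and wait_for_task polls TaskInfo until it reaches "success",
    # raising on error; task_info.result is the new VirtualMachine ref.
    task_info = session.wait_for_task(task)
    vm_ref = task_info.result

The SearchDatastore_Task and later poll lines in this section follow the identical pattern with a different SOAP method.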
[ 1546.555214] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Creating folder: Instances. Parent ref: group-v693105. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1546.556013] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f1d43e3a-b098-4cc2-96c1-dfc4f669a94a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.566775] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Created folder: Instances in parent group-v693105. [ 1546.566978] env[67977]: DEBUG oslo.service.loopingcall [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1546.567190] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1546.569632] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dc9ad425-ae9f-461c-9a9b-55ebda762d46 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.591890] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1546.591890] env[67977]: value = "task-3468242" [ 1546.591890] env[67977]: _type = "Task" [ 1546.591890] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1546.599419] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468242, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1546.815363] env[67977]: DEBUG nova.network.neutron [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Updating instance_info_cache with network_info: [{"id": "43e349c2-b417-44e1-89a9-5b15a09faade", "address": "fa:16:3e:18:80:a0", "network": {"id": "da4332ac-6c00-499e-81de-2b64bd556acc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1328497137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d528c04bd83409eb74e20393651c040", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ded18042-834c-4792-b3e8-b1c377446432", "external-id": "nsx-vlan-transportzone-293", "segmentation_id": 293, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43e349c2-b4", "ovs_interfaceid": "43e349c2-b417-44e1-89a9-5b15a09faade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1546.826700] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Releasing lock "refresh_cache-ac4fe863-2435-48ed-9c7c-9e7144be8e70" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1546.826997] env[67977]: DEBUG nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Instance network_info: |[{"id": "43e349c2-b417-44e1-89a9-5b15a09faade", "address": "fa:16:3e:18:80:a0", "network": {"id": "da4332ac-6c00-499e-81de-2b64bd556acc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1328497137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d528c04bd83409eb74e20393651c040", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ded18042-834c-4792-b3e8-b1c377446432", "external-id": "nsx-vlan-transportzone-293", "segmentation_id": 293, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43e349c2-b4", "ovs_interfaceid": "43e349c2-b417-44e1-89a9-5b15a09faade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 
1546.827395] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:18:80:a0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ded18042-834c-4792-b3e8-b1c377446432', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '43e349c2-b417-44e1-89a9-5b15a09faade', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1546.835178] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Creating folder: Project (1d528c04bd83409eb74e20393651c040). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1546.836049] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-939a3416-6df3-43be-8c4b-85e967e845f1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.846349] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Created folder: Project (1d528c04bd83409eb74e20393651c040) in parent group-v693022. [ 1546.846537] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Creating folder: Instances. Parent ref: group-v693108. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1546.846764] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-070fbcf1-471e-4cea-99e4-5d2d72079a4a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.856406] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Created folder: Instances in parent group-v693108. [ 1546.856628] env[67977]: DEBUG oslo.service.loopingcall [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1546.856806] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1546.857008] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8f3dff1c-b390-4864-918a-ea309a5d53e4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.880747] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1546.880747] env[67977]: value = "task-3468245" [ 1546.880747] env[67977]: _type = "Task" [ 1546.880747] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1546.888573] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468245, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1547.103088] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468242, 'name': CreateVM_Task, 'duration_secs': 0.316296} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1547.103295] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1547.104015] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1547.104221] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1547.104590] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1547.104880] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b1548a01-ca6b-42d2-8fca-abf4e2fd167a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.109785] env[67977]: DEBUG oslo_vmware.api [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for the task: (returnval){ [ 1547.109785] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ce8c0e-8efd-695c-2536-19c5084c62d1" [ 1547.109785] env[67977]: _type = "Task" [ 1547.109785] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1547.117848] env[67977]: DEBUG oslo_vmware.api [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ce8c0e-8efd-695c-2536-19c5084c62d1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1547.390990] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468245, 'name': CreateVM_Task, 'duration_secs': 0.271831} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1547.391219] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1547.391865] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1547.619754] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1547.620037] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1547.620261] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1547.620480] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1547.620779] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1547.621086] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ac0ec04f-ca26-406b-814e-c1948f8bf04a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.625734] env[67977]: DEBUG oslo_vmware.api [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for the task: (returnval){ [ 1547.625734] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52b4a809-d9c6-2db5-84f8-3b0da82a140c" [ 1547.625734] env[67977]: _type = "Task" [ 1547.625734] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1547.632756] env[67977]: DEBUG oslo_vmware.api [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52b4a809-d9c6-2db5-84f8-3b0da82a140c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1548.136398] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1548.136681] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1548.136900] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1548.188933] env[67977]: DEBUG nova.compute.manager [req-c62c551a-82e8-4000-83a5-4cb1813027dc req-6635fc58-aec4-4119-8b36-6d3cc3bf323f service nova] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Received event network-changed-b13751bf-7207-45d2-b0a1-d8608013462d {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1548.189215] env[67977]: DEBUG nova.compute.manager [req-c62c551a-82e8-4000-83a5-4cb1813027dc req-6635fc58-aec4-4119-8b36-6d3cc3bf323f service nova] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Refreshing instance network info cache due to event network-changed-b13751bf-7207-45d2-b0a1-d8608013462d. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1548.189452] env[67977]: DEBUG oslo_concurrency.lockutils [req-c62c551a-82e8-4000-83a5-4cb1813027dc req-6635fc58-aec4-4119-8b36-6d3cc3bf323f service nova] Acquiring lock "refresh_cache-b56ab7a8-cd27-4542-8082-ec023c57e153" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1548.189622] env[67977]: DEBUG oslo_concurrency.lockutils [req-c62c551a-82e8-4000-83a5-4cb1813027dc req-6635fc58-aec4-4119-8b36-6d3cc3bf323f service nova] Acquired lock "refresh_cache-b56ab7a8-cd27-4542-8082-ec023c57e153" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1548.189794] env[67977]: DEBUG nova.network.neutron [req-c62c551a-82e8-4000-83a5-4cb1813027dc req-6635fc58-aec4-4119-8b36-6d3cc3bf323f service nova] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Refreshing network info cache for port b13751bf-7207-45d2-b0a1-d8608013462d {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1548.375173] env[67977]: DEBUG nova.compute.manager [req-4c59acde-b33b-4abb-9a33-8a4857643833 req-35a8321b-7673-4be0-b67f-29f758c8c734 service nova] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Received event network-changed-43e349c2-b417-44e1-89a9-5b15a09faade {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1548.375424] env[67977]: DEBUG nova.compute.manager [req-4c59acde-b33b-4abb-9a33-8a4857643833 req-35a8321b-7673-4be0-b67f-29f758c8c734 service nova] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Refreshing instance network info cache due to event network-changed-43e349c2-b417-44e1-89a9-5b15a09faade. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1548.375690] env[67977]: DEBUG oslo_concurrency.lockutils [req-4c59acde-b33b-4abb-9a33-8a4857643833 req-35a8321b-7673-4be0-b67f-29f758c8c734 service nova] Acquiring lock "refresh_cache-ac4fe863-2435-48ed-9c7c-9e7144be8e70" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1548.375889] env[67977]: DEBUG oslo_concurrency.lockutils [req-4c59acde-b33b-4abb-9a33-8a4857643833 req-35a8321b-7673-4be0-b67f-29f758c8c734 service nova] Acquired lock "refresh_cache-ac4fe863-2435-48ed-9c7c-9e7144be8e70" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1548.376359] env[67977]: DEBUG nova.network.neutron [req-4c59acde-b33b-4abb-9a33-8a4857643833 req-35a8321b-7673-4be0-b67f-29f758c8c734 service nova] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Refreshing network info cache for port 43e349c2-b417-44e1-89a9-5b15a09faade {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1548.437119] env[67977]: DEBUG nova.network.neutron [req-c62c551a-82e8-4000-83a5-4cb1813027dc req-6635fc58-aec4-4119-8b36-6d3cc3bf323f service nova] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Updated VIF entry in instance network info cache for port b13751bf-7207-45d2-b0a1-d8608013462d. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1548.437549] env[67977]: DEBUG nova.network.neutron [req-c62c551a-82e8-4000-83a5-4cb1813027dc req-6635fc58-aec4-4119-8b36-6d3cc3bf323f service nova] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Updating instance_info_cache with network_info: [{"id": "b13751bf-7207-45d2-b0a1-d8608013462d", "address": "fa:16:3e:e6:f6:94", "network": {"id": "1164109a-0805-4699-b73b-2f458affef73", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-646858955-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf3e35c829af479dbea74ebb00553ca4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f8442aa5-73db-4599-8564-b98a6ea26b9b", "external-id": "nsx-vlan-transportzone-893", "segmentation_id": 893, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb13751bf-72", "ovs_interfaceid": "b13751bf-7207-45d2-b0a1-d8608013462d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1548.447086] env[67977]: DEBUG oslo_concurrency.lockutils [req-c62c551a-82e8-4000-83a5-4cb1813027dc req-6635fc58-aec4-4119-8b36-6d3cc3bf323f service nova] Releasing lock "refresh_cache-b56ab7a8-cd27-4542-8082-ec023c57e153" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1548.623628] env[67977]: DEBUG nova.network.neutron [req-4c59acde-b33b-4abb-9a33-8a4857643833 req-35a8321b-7673-4be0-b67f-29f758c8c734 service nova] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Updated VIF entry in instance network info cache for port 43e349c2-b417-44e1-89a9-5b15a09faade. 
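The instance_info_cache payloads logged above and below are plain JSON-style structures, so extracting the useful fields from one VIF entry is ordinary dict traversal. A short example using values abbreviated from the cache entry for port 43e349c2-b417-44e1-89a9-5b15a09faade:

    vif = {
        'id': '43e349c2-b417-44e1-89a9-5b15a09faade',
        'address': 'fa:16:3e:18:80:a0',
        'network': {
            'meta': {'mtu': 8950},
            'subnets': [
                {'cidr': '192.168.128.0/28',
                 'ips': [{'address': '192.168.128.3', 'type': 'fixed'}]},
            ],
        },
        'devname': 'tap43e349c2-b4',
    }

    fixed_ips = [ip['address']
                 for subnet in vif['network']['subnets']
                 for ip in subnet['ips']
                 if ip['type'] == 'fixed']
    print(fixed_ips, vif['network']['meta']['mtu'], vif['devname'])
    # ['192.168.128.3'] 8950 tap43e349c2-b4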
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1548.624018] env[67977]: DEBUG nova.network.neutron [req-4c59acde-b33b-4abb-9a33-8a4857643833 req-35a8321b-7673-4be0-b67f-29f758c8c734 service nova] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Updating instance_info_cache with network_info: [{"id": "43e349c2-b417-44e1-89a9-5b15a09faade", "address": "fa:16:3e:18:80:a0", "network": {"id": "da4332ac-6c00-499e-81de-2b64bd556acc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1328497137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d528c04bd83409eb74e20393651c040", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ded18042-834c-4792-b3e8-b1c377446432", "external-id": "nsx-vlan-transportzone-293", "segmentation_id": 293, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43e349c2-b4", "ovs_interfaceid": "43e349c2-b417-44e1-89a9-5b15a09faade", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1548.633730] env[67977]: DEBUG oslo_concurrency.lockutils [req-4c59acde-b33b-4abb-9a33-8a4857643833 req-35a8321b-7673-4be0-b67f-29f758c8c734 service nova] Releasing lock "refresh_cache-ac4fe863-2435-48ed-9c7c-9e7144be8e70" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1557.772367] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "511896d4-d9cb-42e0-b213-31be3cac191c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1557.772367] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "511896d4-d9cb-42e0-b213-31be3cac191c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1561.213103] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "6fae5126-6618-4337-9a52-d6019727e0b0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1575.775668] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1575.775912] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1576.775584] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1576.775831] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1576.776062] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1576.776185] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Cleaning up deleted instances with incomplete migration {{(pid=67977) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1576.983508] env[67977]: DEBUG oslo_concurrency.lockutils [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "b56ab7a8-cd27-4542-8082-ec023c57e153" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1577.783278] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1577.783278] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1577.783278] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1577.783278] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1577.793578] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1577.793813] env[67977]: DEBUG oslo_concurrency.lockutils [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1577.793998] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1577.794169] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1577.795273] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79829e55-5b44-43ee-9498-b906e5de8b77 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.804852] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-502b1360-1c97-40f3-b38d-58de117407e8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.818060] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-543318ad-5dc0-4cea-a276-f2b5473195e4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.824158] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c17cc27-dd44-4d48-bdb3-0d7db1eb3109 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.852305] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180925MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1577.852453] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1577.852645] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1577.953079] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 32d860b3-f438-400f-8296-e62cc662d618 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1577.953268] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1577.953403] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8f1440c5-e712-4635-9f02-f9cda12da693 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1577.953527] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f03fe248-75df-4237-a6dd-cc49012c2331 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1577.953695] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1577.953826] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e77a441b-952b-42c0-907f-e30888e505a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1577.953945] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1577.954077] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6fae5126-6618-4337-9a52-d6019727e0b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1577.954197] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b56ab7a8-cd27-4542-8082-ec023c57e153 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1577.954315] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1577.965408] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1577.975707] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d1fc2ae5-fa11-41a7-808b-13da16667078 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1577.984899] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 511896d4-d9cb-42e0-b213-31be3cac191c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
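The "Final resource view" reported just below follows arithmetically from the bookkeeping above: ten actively managed instances each hold {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, the three scheduled-but-not-yet-started instances are skipped, and MEMORY_MB carries the 512 MB reserved value visible in the inventory payloads:

    active_instances = 10                     # the ten "actively managed" entries above
    per_instance = {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}
    reserved_memory_mb = 512                  # from the MEMORY_MB inventory below

    used_ram = reserved_memory_mb + active_instances * per_instance['MEMORY_MB']
    used_disk = active_instances * per_instance['DISK_GB']
    used_vcpus = active_instances * per_instance['VCPU']
    print(used_ram, used_disk, used_vcpus)    # 1792 10 10, matching the view below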
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1577.985116] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1577.985268] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1578.001301] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing inventories for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1578.015963] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating ProviderTree inventory for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1578.016196] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating inventory in ProviderTree for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1578.026395] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing aggregate associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, aggregates: None {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1578.043344] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing trait associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, traits: COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1578.204368] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b588ff8-fbe1-42ef-b6da-57e91921f2d0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1578.211807] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-bea3b82c-158d-43e9-9c5b-0df56f75a34c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1578.242862] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b15b163-e405-442b-9a29-b17913d5ab49 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1578.249876] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e65cd46-7d1f-44a8-94e0-801606f699d8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1578.262469] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1578.270454] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1578.283518] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1578.283685] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.431s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1579.276644] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1584.777101] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1584.777359] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1584.777435] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1584.797553] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 
32d860b3-f438-400f-8296-e62cc662d618] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1584.797701] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1584.797838] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1584.797969] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1584.798107] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1584.798231] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1584.798350] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1584.798469] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1584.798585] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1584.798752] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1584.798837] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. 
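The ProviderTree inventory refreshed a few entries above also fixes what the scheduler may place on this node: placement treats (total - reserved) * allocation_ratio as the consumable capacity of each resource class. With the logged payload that works out as:

    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0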
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1588.777606] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1589.785266] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1589.785686] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Cleaning up deleted instances {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1589.794789] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] There are 0 instances to clean {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1591.230207] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1593.137059] env[67977]: WARNING oslo_vmware.rw_handles [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1593.137059] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1593.137059] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1593.137059] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1593.137059] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1593.137059] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1593.137059] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1593.137059] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1593.137059] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1593.137059] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1593.137059] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1593.137059] env[67977]: ERROR oslo_vmware.rw_handles [ 1593.137059] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/008aec59-38d4-47ce-9d46-0737430523dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 
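The datastore paths in this download follow a fixed two-stage layout: the image first lands in a per-transfer vmware_temp/<transfer-id>/<image-id>/tmp-sparse.vmdk, and is then copied into the shared devstack-image-cache_base/<image-id>/<image-id>.vmdk entry that the earlier image-cache locks protect. The construction is plain string handling:

    image_id = '5ac2bac3-6c5c-4005-b6b0-349a1330d017'
    transfer_id = '008aec59-38d4-47ce-9d46-0737430523dc'  # random per download

    tmp_path = ('[datastore1] vmware_temp/%s/%s/tmp-sparse.vmdk'
                % (transfer_id, image_id))
    cache_path = ('[datastore1] devstack-image-cache_base/%s/%s.vmdk'
                  % (image_id, image_id))
    print(tmp_path, '->', cache_path)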
{{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1593.138507] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1593.138747] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Copying Virtual Disk [datastore1] vmware_temp/008aec59-38d4-47ce-9d46-0737430523dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/008aec59-38d4-47ce-9d46-0737430523dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1593.139047] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-211e2931-ab5d-43d0-9291-c703c92b21c3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1593.147082] env[67977]: DEBUG oslo_vmware.api [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Waiting for the task: (returnval){ [ 1593.147082] env[67977]: value = "task-3468246" [ 1593.147082] env[67977]: _type = "Task" [ 1593.147082] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1593.155165] env[67977]: DEBUG oslo_vmware.api [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Task: {'id': task-3468246, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1593.657199] env[67977]: DEBUG oslo_vmware.exceptions [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Fault InvalidArgument not matched. 
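"Fault InvalidArgument not matched" above means get_fault_class found no specific oslo.vmware exception class for the VIM fault name, so the generic VimFaultException is raised with the fault names attached in its fault_list. A hedged sketch of handling it at a call site, where run_copy_task is a hypothetical stand-in for invoking CopyVirtualDisk_Task and waiting on it:

    from oslo_vmware import exceptions as vmw_exc

    def run_copy_task():
        # hypothetical stand-in; re-creates the fault recorded in this log
        raise vmw_exc.VimFaultException(
            ['InvalidArgument'], 'A specified parameter was not correct: fileType')

    try:
        run_copy_task()
    except vmw_exc.VimFaultException as e:
        if 'InvalidArgument' in e.fault_list:
            print('copy rejected by vCenter:', e)
        raise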
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1593.657496] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1593.658059] env[67977]: ERROR nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1593.658059] env[67977]: Faults: ['InvalidArgument'] [ 1593.658059] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] Traceback (most recent call last): [ 1593.658059] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1593.658059] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] yield resources [ 1593.658059] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1593.658059] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] self.driver.spawn(context, instance, image_meta, [ 1593.658059] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1593.658059] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1593.658059] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1593.658059] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] self._fetch_image_if_missing(context, vi) [ 1593.658059] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] image_cache(vi, tmp_image_ds_loc) [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] vm_util.copy_virtual_disk( [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] session._wait_for_task(vmdk_copy_task) [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] return self.wait_for_task(task_ref) [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] return evt.wait() [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] result = hub.switch() [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1593.658376] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] return self.greenlet.switch() [ 1593.658687] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1593.658687] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] self.f(*self.args, **self.kw) [ 1593.658687] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1593.658687] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] raise exceptions.translate_fault(task_info.error) [ 1593.658687] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1593.658687] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] Faults: ['InvalidArgument'] [ 1593.658687] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] [ 1593.658687] env[67977]: INFO nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Terminating instance [ 1593.659861] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1593.660080] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1593.660318] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2c24285c-121c-4530-be97-b5e168d8729a 
{{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1593.663459] env[67977]: DEBUG nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1593.663666] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1593.664439] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6682b7c5-37fe-4749-a870-d24d9edec35e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1593.671295] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1593.671522] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a8fdb1ba-f9d9-4ad0-aa06-0a86abdacdc5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1593.673626] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1593.673799] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1593.674771] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-28449d28-ce20-4fd5-95ff-bd67fec75a8c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1593.679630] env[67977]: DEBUG oslo_vmware.api [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Waiting for the task: (returnval){ [ 1593.679630] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5223cd9c-6b6e-a9ff-dd88-8c2397ec2543" [ 1593.679630] env[67977]: _type = "Task" [ 1593.679630] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1593.686500] env[67977]: DEBUG oslo_vmware.api [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5223cd9c-6b6e-a9ff-dd88-8c2397ec2543, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1593.736195] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1593.736385] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1593.736542] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Deleting the datastore file [datastore1] 32d860b3-f438-400f-8296-e62cc662d618 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1593.736793] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-72956d25-0c20-4b80-b3c1-8771072f012d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1593.742835] env[67977]: DEBUG oslo_vmware.api [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Waiting for the task: (returnval){ [ 1593.742835] env[67977]: value = "task-3468248" [ 1593.742835] env[67977]: _type = "Task" [ 1593.742835] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1593.749959] env[67977]: DEBUG oslo_vmware.api [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Task: {'id': task-3468248, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1594.189672] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1594.191447] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Creating directory with path [datastore1] vmware_temp/a22ce218-0b83-4ab1-b368-2f770265dca8/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1594.191447] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-da1a87dd-545b-4ae8-9399-f51276ab85b5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.201320] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Created directory with path [datastore1] vmware_temp/a22ce218-0b83-4ab1-b368-2f770265dca8/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1594.201504] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Fetch image to [datastore1] vmware_temp/a22ce218-0b83-4ab1-b368-2f770265dca8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1594.201672] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/a22ce218-0b83-4ab1-b368-2f770265dca8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1594.202394] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1768f77-643a-462d-9b56-167760f69bc4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.208521] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63180a2a-9186-40e6-960f-3210c4fa2d61 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.217325] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc40ec4b-8c2d-4b28-b969-b30e6dcb394f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.250198] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6134a071-f2e8-4065-8fa6-2d33943dfc05 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.257263] env[67977]: DEBUG oslo_vmware.api [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Task: {'id': task-3468248, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073988} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1594.258680] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1594.258877] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1594.259062] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1594.259242] env[67977]: INFO nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Took 0.60 seconds to destroy the instance on the hypervisor. 
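The 0.60-second teardown just completed is, in this failed-spawn path, a three-step sequence: unregister the VM from vCenter, delete its datastore directory via a task, then abort the resource claim so the tracker releases the instance's allocations. A compact outline with hypothetical helpers standing in for the vSphere calls:

    def unregister_vm(vm_ref):
        print('VirtualMachine.UnregisterVM on', vm_ref)            # hypothetical helper

    def delete_datastore_dir(ds_path):
        print('FileManager.DeleteDatastoreFile_Task on', ds_path)  # hypothetical helper

    def abort_claim(instance_uuid):
        print('releasing tracked allocations for', instance_uuid)  # hypothetical helper

    def destroy_after_failed_spawn(vm_ref, instance_uuid, ds_path):
        unregister_vm(vm_ref)          # VM leaves the vCenter inventory
        delete_datastore_dir(ds_path)  # e.g. [datastore1] 32d860b3-f438-400f-8296-e62cc662d618
        abort_claim(instance_uuid)     # frees MEMORY_MB/VCPU/DISK_GB under the
                                       # compute_resources lock, as the next lines show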
[ 1594.260973] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e534af19-d899-41d5-a66f-308e1f0b59de {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.263373] env[67977]: DEBUG nova.compute.claims [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1594.263577] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1594.263767] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1594.289630] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1594.339168] env[67977]: DEBUG oslo_vmware.rw_handles [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a22ce218-0b83-4ab1-b368-2f770265dca8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1594.397777] env[67977]: DEBUG oslo_vmware.rw_handles [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1594.397958] env[67977]: DEBUG oslo_vmware.rw_handles [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a22ce218-0b83-4ab1-b368-2f770265dca8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
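The write handle above streams the image to the ESX host's /folder endpoint over an HTTP PUT with a known Content-Length (21318656 bytes in this transfer). A minimal illustration with the standard library; the URL handling is a simplified assumption, and the real handle also attaches the ticket obtained from SessionManager.AcquireGenericServiceTicket:

    import http.client

    def upload_image_sketch(host, path, data):
        # data: raw bytes of the sparse vmdk being written to the datastore
        conn = http.client.HTTPSConnection(host, 443)
        conn.request('PUT', path, body=data,
                     headers={'Content-Length': str(len(data))})
        resp = conn.getresponse()   # RemoteDisconnected surfaces here when the
        status = resp.status        # far end drops the socket, as logged earlier
        conn.close()
        return status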
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1594.510319] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc6e0368-f630-4cc9-9962-b66b1c6321ee {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.517802] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4646f7b1-9585-4cd8-b96a-558eab3a2d02 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.546705] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08f4ed96-772f-46b3-9587-8e0d1758371f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.553519] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8495de08-d000-4238-bdad-4b235d6d6ecf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.566066] env[67977]: DEBUG nova.compute.provider_tree [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1594.574575] env[67977]: DEBUG nova.scheduler.client.report [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1594.587883] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.324s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1594.588411] env[67977]: ERROR nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1594.588411] env[67977]: Faults: ['InvalidArgument'] [ 1594.588411] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] Traceback (most recent call last): [ 1594.588411] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1594.588411] 
env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] self.driver.spawn(context, instance, image_meta, [ 1594.588411] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1594.588411] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1594.588411] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1594.588411] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] self._fetch_image_if_missing(context, vi) [ 1594.588411] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1594.588411] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] image_cache(vi, tmp_image_ds_loc) [ 1594.588411] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] vm_util.copy_virtual_disk( [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] session._wait_for_task(vmdk_copy_task) [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] return self.wait_for_task(task_ref) [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] return evt.wait() [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] result = hub.switch() [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] return self.greenlet.switch() [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1594.588818] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] self.f(*self.args, **self.kw) [ 1594.589164] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1594.589164] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] raise exceptions.translate_fault(task_info.error) [ 1594.589164] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1594.589164] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] Faults: ['InvalidArgument'] [ 1594.589164] env[67977]: ERROR nova.compute.manager [instance: 32d860b3-f438-400f-8296-e62cc662d618] [ 1594.589164] env[67977]: DEBUG nova.compute.utils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1594.590440] env[67977]: DEBUG nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Build of instance 32d860b3-f438-400f-8296-e62cc662d618 was re-scheduled: A specified parameter was not correct: fileType [ 1594.590440] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1594.590800] env[67977]: DEBUG nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1594.590974] env[67977]: DEBUG nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1594.591166] env[67977]: DEBUG nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1594.591331] env[67977]: DEBUG nova.network.neutron [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1594.964320] env[67977]: DEBUG nova.network.neutron [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1594.981090] env[67977]: INFO nova.compute.manager [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Took 0.39 seconds to deallocate network for instance. [ 1595.087415] env[67977]: INFO nova.scheduler.client.report [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Deleted allocations for instance 32d860b3-f438-400f-8296-e62cc662d618 [ 1595.110762] env[67977]: DEBUG oslo_concurrency.lockutils [None req-04e91773-4769-4e51-a3dc-d1d740b83f55 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "32d860b3-f438-400f-8296-e62cc662d618" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 632.709s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1595.111984] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "32d860b3-f438-400f-8296-e62cc662d618" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 435.512s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1595.112135] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Acquiring lock "32d860b3-f438-400f-8296-e62cc662d618-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1595.112340] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "32d860b3-f438-400f-8296-e62cc662d618-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1595.112496] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "32d860b3-f438-400f-8296-e62cc662d618-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1595.114558] env[67977]: INFO nova.compute.manager [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Terminating instance [ 1595.116979] env[67977]: DEBUG nova.compute.manager [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1595.116979] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1595.116979] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-59cd9f37-9920-4f91-bb9d-9ccc87fb82de {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.127525] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc70ece6-57c8-4aef-866b-4e825d552f82 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.138566] env[67977]: DEBUG nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1595.158898] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 32d860b3-f438-400f-8296-e62cc662d618 could not be found. 
[ 1595.158898] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1595.159053] env[67977]: INFO nova.compute.manager [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1595.159871] env[67977]: DEBUG oslo.service.loopingcall [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1595.159871] env[67977]: DEBUG nova.compute.manager [-] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1595.159871] env[67977]: DEBUG nova.network.neutron [-] [instance: 32d860b3-f438-400f-8296-e62cc662d618] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1595.189075] env[67977]: DEBUG nova.network.neutron [-] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1595.193418] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1595.193667] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1595.195498] env[67977]: INFO nova.compute.claims [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1595.198357] env[67977]: INFO nova.compute.manager [-] [instance: 32d860b3-f438-400f-8296-e62cc662d618] Took 0.04 seconds to deallocate network for instance.
[ 1595.287284] env[67977]: DEBUG oslo_concurrency.lockutils [None req-6d5f98c1-34c4-4343-8fe8-230361844f96 tempest-VolumesAdminNegativeTest-966207661 tempest-VolumesAdminNegativeTest-966207661-project-member] Lock "32d860b3-f438-400f-8296-e62cc662d618" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.175s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1595.289061] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "32d860b3-f438-400f-8296-e62cc662d618" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 296.562s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1595.289325] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 32d860b3-f438-400f-8296-e62cc662d618] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1595.289528] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "32d860b3-f438-400f-8296-e62cc662d618" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1595.377585] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f93aab10-d6e2-40e8-ba47-7ca216805d66 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1595.385709] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bace7a2-dd64-4720-bfca-f4b6ffb0ed8d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1595.415471] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-688967aa-6699-42a2-97c0-e505b0cbdfce {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1595.422317] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d99ec0ec-59d6-47a4-8a6f-c1cf473323bc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1595.435011] env[67977]: DEBUG nova.compute.provider_tree [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1595.443401] env[67977]: DEBUG nova.scheduler.client.report [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1595.457781] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1595.458236] env[67977]: DEBUG nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1595.489248] env[67977]: DEBUG nova.compute.utils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1595.490688] env[67977]: DEBUG nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1595.490905] env[67977]: DEBUG nova.network.neutron [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1595.499214] env[67977]: DEBUG nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1595.571057] env[67977]: DEBUG nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1595.574779] env[67977]: DEBUG nova.policy [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9edadfabae414eb9843451bbb1b931ad', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eb5c71d2daaa48f09f9f32a17b9d41c6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1595.596885] env[67977]: DEBUG nova.virt.hardware [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1595.597154] env[67977]: DEBUG nova.virt.hardware [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1595.597323] env[67977]: DEBUG nova.virt.hardware [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1595.597511] env[67977]: DEBUG nova.virt.hardware [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1595.597662] env[67977]: DEBUG nova.virt.hardware [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1595.597811] env[67977]: DEBUG nova.virt.hardware [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1595.598037] env[67977]: DEBUG nova.virt.hardware [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1595.598210] env[67977]: DEBUG nova.virt.hardware [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1595.598388] env[67977]: DEBUG nova.virt.hardware [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1595.598606] env[67977]: DEBUG nova.virt.hardware [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1595.598821] env[67977]: DEBUG nova.virt.hardware [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1595.599701] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be2d97fa-06f7-47e4-b97f-32444a44fa24 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1595.607721] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55070187-13aa-41fa-bf7b-c0e78baf31f0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1595.934575] env[67977]: DEBUG nova.network.neutron [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Successfully created port: 35a65301-0ec4-4a9a-b990-32694073be8e {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1596.840221] env[67977]: DEBUG nova.network.neutron [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Successfully updated port: 35a65301-0ec4-4a9a-b990-32694073be8e {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1596.856747] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "refresh_cache-8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1596.856747] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquired lock "refresh_cache-8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1596.856747] env[67977]: DEBUG nova.network.neutron [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1596.895916] env[67977]: DEBUG nova.network.neutron [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1597.079269] env[67977]: DEBUG nova.compute.manager [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Received event network-vif-plugged-35a65301-0ec4-4a9a-b990-32694073be8e {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1597.079486] env[67977]: DEBUG oslo_concurrency.lockutils [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] Acquiring lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1597.079688] env[67977]: DEBUG oslo_concurrency.lockutils [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] Lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1597.079853] env[67977]: DEBUG oslo_concurrency.lockutils [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] Lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1597.081367] env[67977]: DEBUG nova.compute.manager [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] No waiting events found dispatching network-vif-plugged-35a65301-0ec4-4a9a-b990-32694073be8e {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1597.081618] env[67977]: WARNING nova.compute.manager [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Received unexpected event network-vif-plugged-35a65301-0ec4-4a9a-b990-32694073be8e for instance with vm_state building and task_state spawning.
[ 1597.081796] env[67977]: DEBUG nova.compute.manager [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Received event network-changed-35a65301-0ec4-4a9a-b990-32694073be8e {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1597.082028] env[67977]: DEBUG nova.compute.manager [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Refreshing instance network info cache due to event network-changed-35a65301-0ec4-4a9a-b990-32694073be8e. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1597.082232] env[67977]: DEBUG oslo_concurrency.lockutils [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] Acquiring lock "refresh_cache-8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1597.107763] env[67977]: DEBUG nova.network.neutron [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Updating instance_info_cache with network_info: [{"id": "35a65301-0ec4-4a9a-b990-32694073be8e", "address": "fa:16:3e:65:46:59", "network": {"id": "444f19a5-c228-4ca2-ab8f-91fb58200775", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1364974740-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb5c71d2daaa48f09f9f32a17b9d41c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b033f4d-2e92-4702-add6-410a29d3f251", "external-id": "nsx-vlan-transportzone-649", "segmentation_id": 649, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap35a65301-0e", "ovs_interfaceid": "35a65301-0ec4-4a9a-b990-32694073be8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1597.117933] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Releasing lock "refresh_cache-8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1597.118295] env[67977]: DEBUG nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Instance network_info: |[{"id": "35a65301-0ec4-4a9a-b990-32694073be8e", "address": "fa:16:3e:65:46:59", "network": {"id": "444f19a5-c228-4ca2-ab8f-91fb58200775", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1364974740-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb5c71d2daaa48f09f9f32a17b9d41c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b033f4d-2e92-4702-add6-410a29d3f251", "external-id": "nsx-vlan-transportzone-649", "segmentation_id": 649, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap35a65301-0e", "ovs_interfaceid": "35a65301-0ec4-4a9a-b990-32694073be8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1597.118577] env[67977]: DEBUG oslo_concurrency.lockutils [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] Acquired lock "refresh_cache-8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1597.118764] env[67977]: DEBUG nova.network.neutron [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Refreshing network info cache for port 35a65301-0ec4-4a9a-b990-32694073be8e {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1597.119751] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:65:46:59', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4b033f4d-2e92-4702-add6-410a29d3f251', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '35a65301-0ec4-4a9a-b990-32694073be8e', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1597.127280] env[67977]: DEBUG oslo.service.loopingcall [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1597.128039] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1597.130242] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f1324ac1-394b-4119-ad9e-6febe6224bed {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1597.152044] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1597.152044] env[67977]: value = "task-3468249"
[ 1597.152044] env[67977]: _type = "Task"
[ 1597.152044] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1597.160110] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468249, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1597.486347] env[67977]: DEBUG nova.network.neutron [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Updated VIF entry in instance network info cache for port 35a65301-0ec4-4a9a-b990-32694073be8e. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1597.486757] env[67977]: DEBUG nova.network.neutron [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Updating instance_info_cache with network_info: [{"id": "35a65301-0ec4-4a9a-b990-32694073be8e", "address": "fa:16:3e:65:46:59", "network": {"id": "444f19a5-c228-4ca2-ab8f-91fb58200775", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1364974740-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb5c71d2daaa48f09f9f32a17b9d41c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b033f4d-2e92-4702-add6-410a29d3f251", "external-id": "nsx-vlan-transportzone-649", "segmentation_id": 649, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap35a65301-0e", "ovs_interfaceid": "35a65301-0ec4-4a9a-b990-32694073be8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1597.496439] env[67977]: DEBUG oslo_concurrency.lockutils [req-c83c6823-3484-4a41-b72c-139cb05ec004 req-b4033d70-6401-4354-9bc5-fe87a5186383 service nova] Releasing lock "refresh_cache-8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1597.661638] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468249, 'name': CreateVM_Task, 'duration_secs': 0.286718} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1597.661826] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1597.662510] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1597.662685] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1597.663024] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1597.663290] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a151eeb3-3819-40fb-a0f4-1934372972d3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1597.667738] env[67977]: DEBUG oslo_vmware.api [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for the task: (returnval){
[ 1597.667738] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5269003f-486b-a36b-b5bb-69a3487d698c"
[ 1597.667738] env[67977]: _type = "Task"
[ 1597.667738] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1597.675224] env[67977]: DEBUG oslo_vmware.api [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5269003f-486b-a36b-b5bb-69a3487d698c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1598.178417] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1598.178776] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1598.178878] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1608.120629] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1608.120969] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1636.784871] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1636.785246] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1637.771705] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1637.775501] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1637.775501] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1638.771343] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1638.795474] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1638.795673] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1638.808480] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1638.808698] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1638.808860] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1638.809035] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1638.810130] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62dff8a6-b3c9-4f9d-9343-d7597394da7d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1638.819179] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f32e01a8-35d8-40df-8516-ca9ccf2f10d9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1638.832819] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72b02fdd-ab87-4818-a540-d86620aa356f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1638.838862] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70781a20-910a-40ef-bae0-f9b2629ec430 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1638.867773] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180926MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1638.867925] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1638.868173] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1638.937034] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1638.937227] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8f1440c5-e712-4635-9f02-f9cda12da693 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1638.937370] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f03fe248-75df-4237-a6dd-cc49012c2331 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1638.937493] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1638.937611] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e77a441b-952b-42c0-907f-e30888e505a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1638.937730] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1638.937845] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6fae5126-6618-4337-9a52-d6019727e0b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1638.937960] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b56ab7a8-cd27-4542-8082-ec023c57e153 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1638.938093] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1638.938211] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1638.949047] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d1fc2ae5-fa11-41a7-808b-13da16667078 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1638.958906] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 511896d4-d9cb-42e0-b213-31be3cac191c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1638.968091] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1638.968316] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1638.968465] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1639.110230] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e2f72c2-b429-4c6b-8da2-a08e5bd2ac02 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1639.117621] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aeee1ca2-b0f9-4c38-8866-d24218dae158 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1639.148137] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48c84d7d-8ebe-48f1-bd05-9c3b08c50a58 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1639.154982] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7235bc8-c4c6-4d9a-8c93-bcabb0efa563 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1639.167495] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1639.175911] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1639.188767] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977)
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1639.188986] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.321s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1640.168683] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1640.775066] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1642.398045] env[67977]: WARNING oslo_vmware.rw_handles [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1642.398045] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1642.398045] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1642.398045] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1642.398045] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1642.398045] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1642.398045] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1642.398045] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1642.398045] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1642.398045] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1642.398045] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1642.398045] env[67977]: ERROR oslo_vmware.rw_handles [ 1642.398690] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/a22ce218-0b83-4ab1-b368-2f770265dca8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1642.400664] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1642.400917] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-e876e493-9695-4571-beaa-a028618b153d 
tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Copying Virtual Disk [datastore1] vmware_temp/a22ce218-0b83-4ab1-b368-2f770265dca8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/a22ce218-0b83-4ab1-b368-2f770265dca8/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1642.401262] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ade47579-46af-4cdc-887e-f419341821f9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1642.409645] env[67977]: DEBUG oslo_vmware.api [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Waiting for the task: (returnval){ [ 1642.409645] env[67977]: value = "task-3468250" [ 1642.409645] env[67977]: _type = "Task" [ 1642.409645] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1642.417283] env[67977]: DEBUG oslo_vmware.api [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Task: {'id': task-3468250, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1642.919886] env[67977]: DEBUG oslo_vmware.exceptions [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1642.920278] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1642.920887] env[67977]: ERROR nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1642.920887] env[67977]: Faults: ['InvalidArgument'] [ 1642.920887] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Traceback (most recent call last): [ 1642.920887] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1642.920887] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] yield resources [ 1642.920887] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1642.920887] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] self.driver.spawn(context, instance, image_meta, [ 1642.920887] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1642.920887] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1642.920887] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1642.920887] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] self._fetch_image_if_missing(context, vi) [ 1642.920887] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] image_cache(vi, tmp_image_ds_loc) [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] vm_util.copy_virtual_disk( [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] session._wait_for_task(vmdk_copy_task) [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] return self.wait_for_task(task_ref) [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] return evt.wait() [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] result = hub.switch() [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1642.921248] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] return self.greenlet.switch() [ 1642.921615] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1642.921615] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] self.f(*self.args, **self.kw) [ 1642.921615] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1642.921615] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] raise exceptions.translate_fault(task_info.error) [ 1642.921615] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1642.921615] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Faults: ['InvalidArgument'] [ 1642.921615] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] [ 1642.921615] env[67977]: INFO nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Terminating instance [ 1642.922730] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1642.922935] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1642.923207] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4722c90b-4e26-4d41-8abc-8764e332dd25 {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1642.925351] env[67977]: DEBUG nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1642.925542] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1642.926277] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-861478e5-8332-460c-9af3-7700eea38a0a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1642.933121] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1642.933360] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-70935951-d814-4a2f-9952-8611f1ffc9ab {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1642.935469] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1642.935644] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1642.936636] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-178e0d0e-c88a-4b51-9f40-a711eddde7a2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1642.941262] env[67977]: DEBUG oslo_vmware.api [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for the task: (returnval){ [ 1642.941262] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523ec1a9-94e5-73fe-819c-78596217d85c" [ 1642.941262] env[67977]: _type = "Task" [ 1642.941262] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1642.950167] env[67977]: DEBUG oslo_vmware.api [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523ec1a9-94e5-73fe-819c-78596217d85c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1643.007566] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1643.007779] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1643.007960] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Deleting the datastore file [datastore1] 27743458-4ef0-4ceb-a0bf-cac219dbdc35 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1643.008249] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3105627d-79c9-40d2-8cec-84e9e603100f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1643.014056] env[67977]: DEBUG oslo_vmware.api [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Waiting for the task: (returnval){ [ 1643.014056] env[67977]: value = "task-3468252" [ 1643.014056] env[67977]: _type = "Task" [ 1643.014056] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1643.022827] env[67977]: DEBUG oslo_vmware.api [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Task: {'id': task-3468252, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1643.451521] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1643.451863] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Creating directory with path [datastore1] vmware_temp/3e866671-283e-40a0-962b-0f63e8f760bd/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1643.451940] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f0d5277e-0377-414b-850c-d7462f7afc89 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1643.463110] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Created directory with path [datastore1] vmware_temp/3e866671-283e-40a0-962b-0f63e8f760bd/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1643.463296] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Fetch image to [datastore1] vmware_temp/3e866671-283e-40a0-962b-0f63e8f760bd/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1643.463470] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/3e866671-283e-40a0-962b-0f63e8f760bd/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1643.464226] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be949182-e287-46a4-bdc3-5afe680fea36 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1643.470611] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ecda3c1-6a27-474f-afe2-98548f521922 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1643.479324] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4dc7f0d-ba3f-414a-8973-12ed289bfd3f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1643.509219] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ccd2183-55f1-4e7e-9fe7-02907d85c78a {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1643.514344] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b5c8bb77-58a0-4d2f-a910-29e2d7d2cf1d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1643.523358] env[67977]: DEBUG oslo_vmware.api [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Task: {'id': task-3468252, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063197} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1643.523582] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1643.523761] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1643.523930] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1643.524122] env[67977]: INFO nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Took 0.60 seconds to destroy the instance on the hypervisor. 
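The DeleteDatastoreFile_Task entries above follow the same wait_for_task pattern as the earlier CopyVirtualDisk_Task: the vSphere call returns a task reference immediately, and the caller polls the task until it reaches a terminal state (the repeated "progress is 0%." lines come from _poll_task). A minimal sketch of that polling loop, assuming a hypothetical session.get_task_info() helper; oslo.vmware's real implementation drives this through an eventlet looping call, as the tracebacks above show:

import time

class TaskFailedError(Exception):
    """Raised when the polled task ends in an error state."""

def wait_for_vsphere_task(session, task_ref, interval=0.5):
    # Sketch only: poll the task's info (the real code issues
    # PropertyCollector.RetrievePropertiesEx, as logged above) until
    # the task state is terminal.
    while True:
        info = session.get_task_info(task_ref)  # assumed helper
        if info.state == 'success':
            return info  # carries e.g. 'duration_secs', as logged above
        if info.state == 'error':
            # oslo.vmware maps known faults to exception classes here; an
            # unmatched fault ("Fault InvalidArgument not matched" above)
            # falls back to the generic VimFaultException.
            raise TaskFailedError(info.error)
        time.sleep(interval)
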
[ 1643.526203] env[67977]: DEBUG nova.compute.claims [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1643.526377] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1643.526590] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1643.536248] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1643.587457] env[67977]: DEBUG oslo_vmware.rw_handles [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3e866671-283e-40a0-962b-0f63e8f760bd/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1643.645456] env[67977]: DEBUG oslo_vmware.rw_handles [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1643.645644] env[67977]: DEBUG oslo_vmware.rw_handles [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3e866671-283e-40a0-962b-0f63e8f760bd/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1643.765599] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccd82dfe-dbfc-47b4-99d2-903faf965375 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1643.773417] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60554a50-e953-4417-ab23-7f6d25855c3f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1643.804747] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8527d331-7f48-42f4-9ad3-9c4147114434 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1643.813025] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b96daab-066f-462e-8c4d-6b9128c1580d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1643.825457] env[67977]: DEBUG nova.compute.provider_tree [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1643.833668] env[67977]: DEBUG nova.scheduler.client.report [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1643.847970] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.321s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1643.848516] env[67977]: ERROR nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1643.848516] env[67977]: Faults: ['InvalidArgument'] [ 1643.848516] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Traceback (most recent call last): [ 1643.848516] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1643.848516] env[67977]: 
ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] self.driver.spawn(context, instance, image_meta, [ 1643.848516] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1643.848516] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1643.848516] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1643.848516] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] self._fetch_image_if_missing(context, vi) [ 1643.848516] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1643.848516] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] image_cache(vi, tmp_image_ds_loc) [ 1643.848516] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] vm_util.copy_virtual_disk( [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] session._wait_for_task(vmdk_copy_task) [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] return self.wait_for_task(task_ref) [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] return evt.wait() [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] result = hub.switch() [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] return self.greenlet.switch() [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1643.849269] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] self.f(*self.args, **self.kw) [ 1643.849589] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1643.849589] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] raise exceptions.translate_fault(task_info.error) [ 1643.849589] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1643.849589] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Faults: ['InvalidArgument'] [ 1643.849589] env[67977]: ERROR nova.compute.manager [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] [ 1643.849589] env[67977]: DEBUG nova.compute.utils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1643.850673] env[67977]: DEBUG nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Build of instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 was re-scheduled: A specified parameter was not correct: fileType [ 1643.850673] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1643.851058] env[67977]: DEBUG nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1643.851238] env[67977]: DEBUG nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1643.851409] env[67977]: DEBUG nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1643.851570] env[67977]: DEBUG nova.network.neutron [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1644.487311] env[67977]: DEBUG nova.network.neutron [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1644.499379] env[67977]: INFO nova.compute.manager [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Took 0.65 seconds to deallocate network for instance. [ 1644.605588] env[67977]: INFO nova.scheduler.client.report [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Deleted allocations for instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 [ 1644.626040] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e876e493-9695-4571-beaa-a028618b153d tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 630.045s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1644.628959] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 433.435s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1644.628959] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Acquiring lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1644.628959] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1644.629210] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1644.630098] env[67977]: INFO nova.compute.manager [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Terminating instance [ 1644.632155] env[67977]: DEBUG nova.compute.manager [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1644.632356] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1644.632616] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d4f5221e-5046-4296-b575-0c03a810d4d4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.638807] env[67977]: DEBUG nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1644.645905] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-037df3e1-b5f0-4ea9-b692-0cbdb5fc8eff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.677351] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 27743458-4ef0-4ceb-a0bf-cac219dbdc35 could not be found. [ 1644.677568] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1644.677746] env[67977]: INFO nova.compute.manager [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Took 0.05 seconds to destroy the instance on the hypervisor. 
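Each "Acquiring lock ... / acquired ... :: waited / "released" ... :: held" triple in these records is emitted by oslo.concurrency's lockutils wrapper around the named function. A sketch of the two usage patterns visible in this log; the function names, lock names, and bodies here are illustrative, not copied from Nova:

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def update_tracker():
    # The decorator emits the triple seen above on entry and exit:
    #   Acquiring lock "compute_resources" by "update_tracker"
    #   Lock "compute_resources" acquired ... :: waited 0.000s
    #   Lock "compute_resources" "released" ... :: held 0.321s
    pass

# The context-manager form produces the plain "Acquired lock"/"Releasing
# lock" pairs used above for the datastore image-cache entry.
with lockutils.lock('image-cache-entry'):  # hypothetical lock name
    pass

update_tracker()

By default both forms take a process-local semaphore keyed by the lock name; passing external=True switches to file-based locks that also serialize across processes.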
[ 1644.677984] env[67977]: DEBUG oslo.service.loopingcall [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1644.682519] env[67977]: DEBUG nova.compute.manager [-] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1644.682628] env[67977]: DEBUG nova.network.neutron [-] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1644.694160] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1644.694398] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1644.695952] env[67977]: INFO nova.compute.claims [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1644.707867] env[67977]: DEBUG nova.network.neutron [-] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1644.725232] env[67977]: INFO nova.compute.manager [-] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] Took 0.04 seconds to deallocate network for instance. [ 1644.809775] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9bc5c1b2-9dfc-4f01-bc73-eba901235e62 tempest-ServerActionsTestOtherB-606807175 tempest-ServerActionsTestOtherB-606807175-project-member] Lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1644.810103] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 346.083s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1644.810437] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 27743458-4ef0-4ceb-a0bf-cac219dbdc35] During sync_power_state the instance has a pending task (deleting). Skip. 
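The oslo.service.loopingcall entry above ("Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return") is the timer wrapper Nova puts around network deallocation so the operation can be retried at a fixed interval. A sketch of the FixedIntervalLoopingCall pattern that produces that message, with an illustrative flaky function standing in for Nova's actual retry logic:

from oslo_service import loopingcall

attempts = {'count': 0}

def _deallocate_with_retries():
    # Illustrative stand-in: succeed on the third call. Returning
    # normally means "not done yet"; the timer re-invokes the function
    # after `interval` seconds until it raises LoopingCallDone.
    attempts['count'] += 1
    if attempts['count'] < 3:
        return
    raise loopingcall.LoopingCallDone(retvalue=True)

timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
result = timer.start(interval=0.5).wait()  # blocks until LoopingCallDone
print(result)  # True

start() returns an event whose wait() yields the retvalue passed to LoopingCallDone, which is why the log phrases this as waiting for the wrapped function to return.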
[ 1644.810576] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "27743458-4ef0-4ceb-a0bf-cac219dbdc35" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1644.894807] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-282c9a78-e846-4452-b84e-8560a8db29ff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.903213] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1ff593d-2098-4a67-a7a9-33dcb6e7b3df {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.931869] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d8e0557-148c-4428-aa0f-4346f8481d51 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.939030] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b18b3b0-f34a-472a-9f50-d120dbbeda57 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.951691] env[67977]: DEBUG nova.compute.provider_tree [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1644.960499] env[67977]: DEBUG nova.scheduler.client.report [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1644.974383] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1644.974937] env[67977]: DEBUG nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Start building networks asynchronously for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1645.011869] env[67977]: DEBUG nova.compute.utils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1645.013280] env[67977]: DEBUG nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1645.013484] env[67977]: DEBUG nova.network.neutron [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1645.021376] env[67977]: DEBUG nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1645.071707] env[67977]: DEBUG nova.policy [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4b11845d19f949ec9f5011fb32430517', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c693200ece7542d1b51db597c96768eb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1645.090768] env[67977]: DEBUG nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1645.115989] env[67977]: DEBUG nova.virt.hardware [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1645.116263] env[67977]: DEBUG nova.virt.hardware [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1645.116436] env[67977]: DEBUG nova.virt.hardware [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1645.116620] env[67977]: DEBUG nova.virt.hardware [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1645.116768] env[67977]: DEBUG nova.virt.hardware [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1645.116992] env[67977]: DEBUG nova.virt.hardware [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1645.117135] env[67977]: DEBUG nova.virt.hardware [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1645.117300] env[67977]: DEBUG nova.virt.hardware [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1645.117466] env[67977]: DEBUG nova.virt.hardware [None 
req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1645.117627] env[67977]: DEBUG nova.virt.hardware [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1645.117796] env[67977]: DEBUG nova.virt.hardware [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1645.118745] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-374e154c-67c0-46a4-ba4e-e49f9def5edd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.127105] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f8ce26c-145c-46ef-98df-57e89c7edd64 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.393482] env[67977]: DEBUG nova.network.neutron [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Successfully created port: 612cd4d6-8356-481d-baba-f7c335ab8340 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1645.776130] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1645.776439] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1645.776439] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1645.797810] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1645.797953] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1645.798192] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1645.798359] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1645.798489] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1645.798613] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1645.798734] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1645.798879] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1645.798963] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1645.799096] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1645.799220] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1646.013942] env[67977]: DEBUG nova.network.neutron [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Successfully updated port: 612cd4d6-8356-481d-baba-f7c335ab8340 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1646.029252] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "refresh_cache-d1fc2ae5-fa11-41a7-808b-13da16667078" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1646.029398] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquired lock "refresh_cache-d1fc2ae5-fa11-41a7-808b-13da16667078" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1646.029548] env[67977]: DEBUG nova.network.neutron [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1646.074305] env[67977]: DEBUG nova.network.neutron [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1646.306445] env[67977]: DEBUG nova.network.neutron [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Updating instance_info_cache with network_info: [{"id": "612cd4d6-8356-481d-baba-f7c335ab8340", "address": "fa:16:3e:b4:45:ff", "network": {"id": "cbd12706-d748-47e2-84b6-d528ee8d4a61", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-955781397-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c693200ece7542d1b51db597c96768eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "680cb499-2a47-482b-af0d-112016ac0e17", "external-id": "nsx-vlan-transportzone-644", "segmentation_id": 644, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap612cd4d6-83", "ovs_interfaceid": "612cd4d6-8356-481d-baba-f7c335ab8340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1646.322168] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Releasing lock "refresh_cache-d1fc2ae5-fa11-41a7-808b-13da16667078" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1646.322475] env[67977]: DEBUG nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Instance network_info: |[{"id": "612cd4d6-8356-481d-baba-f7c335ab8340", "address": "fa:16:3e:b4:45:ff", "network": {"id": "cbd12706-d748-47e2-84b6-d528ee8d4a61", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-955781397-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c693200ece7542d1b51db597c96768eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "680cb499-2a47-482b-af0d-112016ac0e17", "external-id": "nsx-vlan-transportzone-644", "segmentation_id": 644, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap612cd4d6-83", "ovs_interfaceid": "612cd4d6-8356-481d-baba-f7c335ab8340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1646.322872] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b4:45:ff', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '680cb499-2a47-482b-af0d-112016ac0e17', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '612cd4d6-8356-481d-baba-f7c335ab8340', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1646.330257] env[67977]: DEBUG oslo.service.loopingcall [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1646.330899] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1646.331294] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3e837002-5e17-4cc9-948e-1df4594e8cc8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.351275] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1646.351275] env[67977]: value = "task-3468253" [ 1646.351275] env[67977]: _type = "Task" [ 1646.351275] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1646.359130] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468253, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1646.534569] env[67977]: DEBUG nova.compute.manager [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Received event network-vif-plugged-612cd4d6-8356-481d-baba-f7c335ab8340 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1646.534821] env[67977]: DEBUG oslo_concurrency.lockutils [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] Acquiring lock "d1fc2ae5-fa11-41a7-808b-13da16667078-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1646.535094] env[67977]: DEBUG oslo_concurrency.lockutils [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] Lock "d1fc2ae5-fa11-41a7-808b-13da16667078-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1646.535303] env[67977]: DEBUG oslo_concurrency.lockutils [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] Lock "d1fc2ae5-fa11-41a7-808b-13da16667078-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1646.535526] env[67977]: DEBUG nova.compute.manager [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] No waiting events found dispatching network-vif-plugged-612cd4d6-8356-481d-baba-f7c335ab8340 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1646.535738] env[67977]: WARNING nova.compute.manager [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Received unexpected event network-vif-plugged-612cd4d6-8356-481d-baba-f7c335ab8340 for instance with vm_state building and task_state spawning. [ 1646.535927] env[67977]: DEBUG nova.compute.manager [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Received event network-changed-612cd4d6-8356-481d-baba-f7c335ab8340 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1646.536135] env[67977]: DEBUG nova.compute.manager [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Refreshing instance network info cache due to event network-changed-612cd4d6-8356-481d-baba-f7c335ab8340. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1646.536377] env[67977]: DEBUG oslo_concurrency.lockutils [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] Acquiring lock "refresh_cache-d1fc2ae5-fa11-41a7-808b-13da16667078" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1646.536541] env[67977]: DEBUG oslo_concurrency.lockutils [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] Acquired lock "refresh_cache-d1fc2ae5-fa11-41a7-808b-13da16667078" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1646.536742] env[67977]: DEBUG nova.network.neutron [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Refreshing network info cache for port 612cd4d6-8356-481d-baba-f7c335ab8340 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1646.848929] env[67977]: DEBUG nova.network.neutron [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Updated VIF entry in instance network info cache for port 612cd4d6-8356-481d-baba-f7c335ab8340. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1646.849319] env[67977]: DEBUG nova.network.neutron [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Updating instance_info_cache with network_info: [{"id": "612cd4d6-8356-481d-baba-f7c335ab8340", "address": "fa:16:3e:b4:45:ff", "network": {"id": "cbd12706-d748-47e2-84b6-d528ee8d4a61", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-955781397-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c693200ece7542d1b51db597c96768eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "680cb499-2a47-482b-af0d-112016ac0e17", "external-id": "nsx-vlan-transportzone-644", "segmentation_id": 644, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap612cd4d6-83", "ovs_interfaceid": "612cd4d6-8356-481d-baba-f7c335ab8340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1646.861904] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468253, 'name': CreateVM_Task, 'duration_secs': 0.330391} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1646.862780] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1646.863360] env[67977]: DEBUG oslo_concurrency.lockutils [req-2ce76d47-0b9a-4eaf-a0ea-ddec1c10bee6 req-97836259-9965-4fed-916f-1cc291bfc871 service nova] Releasing lock "refresh_cache-d1fc2ae5-fa11-41a7-808b-13da16667078" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1646.864238] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1646.864403] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1646.864731] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1646.865346] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1f2b89e9-cccd-4c02-84cb-5a29caeff556 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.870552] env[67977]: DEBUG oslo_vmware.api [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for the task: (returnval){ [ 1646.870552] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52b67ca1-a4b4-687f-f6d0-d46ee269a8ed" [ 1646.870552] env[67977]: _type = "Task" [ 1646.870552] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1646.881904] env[67977]: DEBUG oslo_vmware.api [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52b67ca1-a4b4-687f-f6d0-d46ee269a8ed, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1647.384587] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1647.385303] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1647.385303] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1660.059712] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1692.572578] env[67977]: WARNING oslo_vmware.rw_handles [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1692.572578] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1692.572578] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1692.572578] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1692.572578] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1692.572578] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1692.572578] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1692.572578] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1692.572578] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1692.572578] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1692.572578] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1692.572578] env[67977]: ERROR oslo_vmware.rw_handles [ 1692.573536] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] 
Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/3e866671-283e-40a0-962b-0f63e8f760bd/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1692.575124] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1692.575396] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Copying Virtual Disk [datastore1] vmware_temp/3e866671-283e-40a0-962b-0f63e8f760bd/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/3e866671-283e-40a0-962b-0f63e8f760bd/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1692.575705] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3de66ad3-0dbf-4679-881f-8c41ca375715 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1692.583839] env[67977]: DEBUG oslo_vmware.api [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for the task: (returnval){ [ 1692.583839] env[67977]: value = "task-3468254" [ 1692.583839] env[67977]: _type = "Task" [ 1692.583839] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1692.593488] env[67977]: DEBUG oslo_vmware.api [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Task: {'id': task-3468254, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1693.094393] env[67977]: DEBUG oslo_vmware.exceptions [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1693.094642] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1693.095206] env[67977]: ERROR nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1693.095206] env[67977]: Faults: ['InvalidArgument'] [ 1693.095206] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Traceback (most recent call last): [ 1693.095206] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1693.095206] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] yield resources [ 1693.095206] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1693.095206] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] self.driver.spawn(context, instance, image_meta, [ 1693.095206] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1693.095206] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1693.095206] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1693.095206] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] self._fetch_image_if_missing(context, vi) [ 1693.095206] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] image_cache(vi, tmp_image_ds_loc) [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] vm_util.copy_virtual_disk( [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] session._wait_for_task(vmdk_copy_task) [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] return self.wait_for_task(task_ref) [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] return evt.wait() [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] result = hub.switch() [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1693.095876] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] return self.greenlet.switch() [ 1693.096481] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1693.096481] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] self.f(*self.args, **self.kw) [ 1693.096481] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1693.096481] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] raise exceptions.translate_fault(task_info.error) [ 1693.096481] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1693.096481] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Faults: ['InvalidArgument'] [ 1693.096481] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] [ 1693.096481] env[67977]: INFO nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Terminating instance [ 1693.097126] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1693.097348] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1693.097953] env[67977]: DEBUG nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 
8f1440c5-e712-4635-9f02-f9cda12da693] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1693.098157] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1693.098375] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-94e41056-953e-4f93-beee-8d640dc217d2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.100795] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d59e0076-87ee-4a6a-aecc-5ce41e70d8be {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.107210] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1693.107420] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d59ac295-4ccc-440a-b682-620e9b9996b3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.109350] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1693.109523] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1693.110450] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ecd80018-27ea-4f50-bdd8-00d1a5975ee6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.114961] env[67977]: DEBUG oslo_vmware.api [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for the task: (returnval){ [ 1693.114961] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ddef9b-2686-ecb7-fbf5-7f6d1f447584" [ 1693.114961] env[67977]: _type = "Task" [ 1693.114961] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1693.124080] env[67977]: DEBUG oslo_vmware.api [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ddef9b-2686-ecb7-fbf5-7f6d1f447584, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1693.180819] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1693.181055] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1693.181285] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Deleting the datastore file [datastore1] 8f1440c5-e712-4635-9f02-f9cda12da693 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1693.181548] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-775248fb-32bc-4cd2-ace4-e5d6826f7686 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.189291] env[67977]: DEBUG oslo_vmware.api [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for the task: (returnval){ [ 1693.189291] env[67977]: value = "task-3468256" [ 1693.189291] env[67977]: _type = "Task" [ 1693.189291] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1693.197482] env[67977]: DEBUG oslo_vmware.api [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Task: {'id': task-3468256, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1693.625532] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1693.625845] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Creating directory with path [datastore1] vmware_temp/adc7c54c-2d23-4925-bd8b-795c6349625a/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1693.626157] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-50e03d22-767c-4331-9e43-c0dcfb5c2729 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.636987] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Created directory with path [datastore1] vmware_temp/adc7c54c-2d23-4925-bd8b-795c6349625a/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1693.637257] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Fetch image to [datastore1] vmware_temp/adc7c54c-2d23-4925-bd8b-795c6349625a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1693.637488] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/adc7c54c-2d23-4925-bd8b-795c6349625a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1693.638249] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3454b9d5-a421-4729-81ac-af3fe2ee05ce {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.646301] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bc84cdb-8902-45d6-8553-3b3de32b4de4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.655102] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4a31b45-0c97-4d66-b428-e26614ebb5e1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.686133] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99e6027c-c961-4167-baad-bbe9d673ad2b 
{{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.694909] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-40093203-6111-48ff-8853-b9b1da4b3321 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.699236] env[67977]: DEBUG oslo_vmware.api [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Task: {'id': task-3468256, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070762} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1693.699800] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1693.699990] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1693.700184] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1693.700361] env[67977]: INFO nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Took 0.60 seconds to destroy the instance on the hypervisor. 
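The task entries around here ("Task: {'id': task-3468254, 'name': CopyVirtualDisk_Task} progress is 0%." ending in the InvalidArgument fault) trace oslo.vmware's poll loop: a vCenter task moref is polled until it reaches a terminal state, and an error state is translated into a Python exception such as the VimFaultException seen in the surrounding tracebacks. A minimal sketch of that loop, assuming the public oslo_vmware helpers (the function name and poll_interval are illustrative; the real code drives the loop from a FixedIntervalLoopingCall rather than time.sleep, which is why the tracebacks pass through oslo_vmware/common/loopingcall.py before _poll_task raises):

    import time

    from oslo_vmware import exceptions, vim_util

    def wait_for_task_sketch(session, task_ref, poll_interval=0.5):
        # Poll a vSphere task moref until it finishes or faults.
        while True:
            task_info = session.invoke_api(vim_util, 'get_object_property',
                                           session.vim, task_ref, 'info')
            if task_info.state == 'success':
                return task_info
            if task_info.state == 'error':
                # Matches the traceback's
                # "raise exceptions.translate_fault(task_info.error)";
                # a fault with no specific class mapping (the log's
                # "Fault InvalidArgument not matched.") falls back to
                # VimFaultException.
                raise exceptions.translate_fault(task_info.error)
            time.sleep(poll_interval)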
[ 1693.702407] env[67977]: DEBUG nova.compute.claims [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1693.702578] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1693.702791] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1693.720841] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1693.777688] env[67977]: DEBUG oslo_vmware.rw_handles [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/adc7c54c-2d23-4925-bd8b-795c6349625a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1693.838035] env[67977]: DEBUG oslo_vmware.rw_handles [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1693.838035] env[67977]: DEBUG oslo_vmware.rw_handles [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/adc7c54c-2d23-4925-bd8b-795c6349625a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1693.964363] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3723dee7-889d-45c5-8943-bdec4e873654 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1693.972352] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a97e6f40-408e-4774-9a42-410c846267d4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1694.002803] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e84da15-b054-4fab-bc19-8057de317b88 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1694.010109] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e46417e-2fc5-471d-9a53-4770f9081d9f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1694.023052] env[67977]: DEBUG nova.compute.provider_tree [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1694.033990] env[67977]: DEBUG nova.scheduler.client.report [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1694.047289] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.344s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1694.047821] env[67977]: ERROR nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1694.047821] env[67977]: Faults: ['InvalidArgument'] [ 1694.047821] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Traceback (most recent call last): [ 1694.047821] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1694.047821] env[67977]: ERROR 
nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] self.driver.spawn(context, instance, image_meta, [ 1694.047821] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1694.047821] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1694.047821] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1694.047821] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] self._fetch_image_if_missing(context, vi) [ 1694.047821] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1694.047821] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] image_cache(vi, tmp_image_ds_loc) [ 1694.047821] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] vm_util.copy_virtual_disk( [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] session._wait_for_task(vmdk_copy_task) [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] return self.wait_for_task(task_ref) [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] return evt.wait() [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] result = hub.switch() [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] return self.greenlet.switch() [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1694.048189] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] self.f(*self.args, **self.kw) [ 1694.048580] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1694.048580] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] raise exceptions.translate_fault(task_info.error) [ 1694.048580] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1694.048580] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Faults: ['InvalidArgument'] [ 1694.048580] env[67977]: ERROR nova.compute.manager [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] [ 1694.048580] env[67977]: DEBUG nova.compute.utils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1694.049854] env[67977]: DEBUG nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Build of instance 8f1440c5-e712-4635-9f02-f9cda12da693 was re-scheduled: A specified parameter was not correct: fileType [ 1694.049854] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1694.050244] env[67977]: DEBUG nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1694.050423] env[67977]: DEBUG nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1694.050595] env[67977]: DEBUG nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1694.050761] env[67977]: DEBUG nova.network.neutron [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1694.355846] env[67977]: DEBUG nova.network.neutron [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1694.369161] env[67977]: INFO nova.compute.manager [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Took 0.32 seconds to deallocate network for instance. [ 1694.466282] env[67977]: INFO nova.scheduler.client.report [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Deleted allocations for instance 8f1440c5-e712-4635-9f02-f9cda12da693 [ 1694.487423] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fcdc8383-139b-4e3c-a62f-73a4f96c4f85 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8f1440c5-e712-4635-9f02-f9cda12da693" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 624.570s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1694.488549] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8f1440c5-e712-4635-9f02-f9cda12da693" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 428.818s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1694.488773] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "8f1440c5-e712-4635-9f02-f9cda12da693-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1694.488976] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8f1440c5-e712-4635-9f02-f9cda12da693-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1694.489165] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8f1440c5-e712-4635-9f02-f9cda12da693-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1694.491278] env[67977]: INFO nova.compute.manager [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Terminating instance [ 1694.492878] env[67977]: DEBUG nova.compute.manager [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1694.493088] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1694.493547] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ded2ece0-fa25-4788-9179-4e3b3c56c773 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1694.502796] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccf1e482-d2c3-4d6b-bf9a-87f2a2cf7e4c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1694.513593] env[67977]: DEBUG nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1694.535574] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8f1440c5-e712-4635-9f02-f9cda12da693 could not be found. [ 1694.535786] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1694.535963] env[67977]: INFO nova.compute.manager [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Took 0.04 seconds to destroy the instance on the hypervisor. 
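The lock triplets in the entries above ("Acquiring lock X by Y", "Lock X acquired by Y :: waited N", "Lock X "released" by Y :: held N") all come from oslo.concurrency: the synchronized decorator's inner wrapper (lockutils.py:404/409/423) and the lock() context manager (lockutils.py:312/333). A minimal sketch of both forms, assuming only the public lockutils API; the function names and bodies below are illustrative stand-ins, not Nova's code:

```python
# Sketch of the oslo.concurrency pattern behind the lock lines in this log.
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def instance_claim():
    # Runs with the named internal semaphore held; the wrapping "inner"
    # function logs the acquire (with wait time) and release (with hold time).
    pass

# Per-instance serialization simply uses the instance UUID as the lock name:
with lockutils.lock('8f1440c5-e712-4635-9f02-f9cda12da693'):
    pass  # e.g. the terminate path guarded above
```

With debug logging enabled, both forms produce acquire/wait/hold lines of the kind seen throughout this section.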
[ 1694.536338] env[67977]: DEBUG oslo.service.loopingcall [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1694.536571] env[67977]: DEBUG nova.compute.manager [-] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1694.536668] env[67977]: DEBUG nova.network.neutron [-] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1694.559539] env[67977]: DEBUG nova.network.neutron [-] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1694.562762] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1694.562992] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1694.564459] env[67977]: INFO nova.compute.claims [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1694.568534] env[67977]: INFO nova.compute.manager [-] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] Took 0.03 seconds to deallocate network for instance. [ 1694.664139] env[67977]: DEBUG oslo_concurrency.lockutils [None req-f6f9b0fb-0b33-476e-b6c1-dd0918c3466e tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8f1440c5-e712-4635-9f02-f9cda12da693" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.175s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1694.664995] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "8f1440c5-e712-4635-9f02-f9cda12da693" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 395.938s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1694.665201] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8f1440c5-e712-4635-9f02-f9cda12da693] During sync_power_state the instance has a pending task (deleting). Skip. 
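The "Inventory has not changed for provider ..." payload that the report client logs on each pass (above and again below) is easier to read with Placement's capacity rule in mind: usable capacity per resource class is (total - reserved) * allocation_ratio, and max_unit caps what any single allocation may request. A small illustrative calculation over the dict from this log; the script is a sketch, not Nova or Placement source:

```python
# Reading aid for the inventory dicts logged in this section (sketch only).
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'max_unit': 16,    'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'max_unit': 65530, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'max_unit': 94,    'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    # Placement admits allocations while used + requested <= capacity.
    capacity = int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
    print(f"{rc}: capacity={capacity}, per-allocation cap={inv['max_unit']}")
```

So the 48 physical vCPUs admit up to 192 allocated VCPU at the 4.0 ratio, which is why the resource tracker's later audit ("Total usable vcpus: 48, total allocated vcpus: 10") is nowhere near the limit.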
[ 1694.665383] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "8f1440c5-e712-4635-9f02-f9cda12da693" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1694.743520] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07dc8faf-0e12-45a6-a8f8-845fde35a05e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1694.751508] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6d62b1c-4b91-4905-a892-aff9f1f3a9f9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1694.780967] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39c4f501-9510-4b04-9a04-d679f7a9a8d6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1694.788235] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8ae2dcc-1391-4ae7-8dc6-c8160e630887 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1694.800825] env[67977]: DEBUG nova.compute.provider_tree [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1694.809293] env[67977]: DEBUG nova.scheduler.client.report [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1694.822243] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1694.822775] env[67977]: DEBUG nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Start building networks asynchronously for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1694.853090] env[67977]: DEBUG nova.compute.utils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1694.854531] env[67977]: DEBUG nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1694.854630] env[67977]: DEBUG nova.network.neutron [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1694.863642] env[67977]: DEBUG nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1694.914774] env[67977]: DEBUG nova.policy [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '78df84566c65469890b3b6f15f3e5e01', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ff581ae563e45108f497cade6990d79', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1694.935881] env[67977]: DEBUG nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1694.962605] env[67977]: DEBUG nova.virt.hardware [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=<?>,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:53:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1694.962857] env[67977]: DEBUG nova.virt.hardware [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1694.963019] env[67977]: DEBUG nova.virt.hardware [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1694.963239] env[67977]: DEBUG nova.virt.hardware [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1694.963359] env[67977]: DEBUG nova.virt.hardware [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1694.963503] env[67977]: DEBUG nova.virt.hardware [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1694.963708] env[67977]: DEBUG nova.virt.hardware [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1694.963865] env[67977]: DEBUG nova.virt.hardware [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1694.964190] env[67977]: DEBUG nova.virt.hardware [None 
req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1694.964425] env[67977]: DEBUG nova.virt.hardware [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1694.964642] env[67977]: DEBUG nova.virt.hardware [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1694.965541] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6aaba75-c585-4822-99a2-7bbbdb319513 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1694.974023] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3decdb6-301e-4041-b9ec-e1e6a62ef016 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1695.314351] env[67977]: DEBUG nova.network.neutron [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Successfully created port: 811f8c79-f78c-4d4c-963b-0c6d6da418e3 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1695.938464] env[67977]: DEBUG nova.network.neutron [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Successfully updated port: 811f8c79-f78c-4d4c-963b-0c6d6da418e3 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1695.950695] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "refresh_cache-511896d4-d9cb-42e0-b213-31be3cac191c" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1695.950858] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "refresh_cache-511896d4-d9cb-42e0-b213-31be3cac191c" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1695.951028] env[67977]: DEBUG nova.network.neutron [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1695.989396] env[67977]: DEBUG nova.network.neutron [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 
tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1696.236056] env[67977]: DEBUG nova.network.neutron [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Updating instance_info_cache with network_info: [{"id": "811f8c79-f78c-4d4c-963b-0c6d6da418e3", "address": "fa:16:3e:d3:c1:5f", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap811f8c79-f7", "ovs_interfaceid": "811f8c79-f78c-4d4c-963b-0c6d6da418e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1696.254242] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "refresh_cache-511896d4-d9cb-42e0-b213-31be3cac191c" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1696.254242] env[67977]: DEBUG nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Instance network_info: |[{"id": "811f8c79-f78c-4d4c-963b-0c6d6da418e3", "address": "fa:16:3e:d3:c1:5f", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap811f8c79-f7", "ovs_interfaceid": "811f8c79-f78c-4d4c-963b-0c6d6da418e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1696.254367] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d3:c1:5f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5efce30e-48dd-493a-a354-f562a8adf7af', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '811f8c79-f78c-4d4c-963b-0c6d6da418e3', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1696.262531] env[67977]: DEBUG oslo.service.loopingcall [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1696.263255] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1696.263878] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-04e427b6-5ad7-4600-8275-5c06a5d0fe06 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.289367] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1696.289367] env[67977]: value = "task-3468257" [ 1696.289367] env[67977]: _type = "Task" [ 1696.289367] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1696.298491] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468257, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1696.393544] env[67977]: DEBUG nova.compute.manager [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Received event network-vif-plugged-811f8c79-f78c-4d4c-963b-0c6d6da418e3 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1696.393767] env[67977]: DEBUG oslo_concurrency.lockutils [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] Acquiring lock "511896d4-d9cb-42e0-b213-31be3cac191c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1696.394010] env[67977]: DEBUG oslo_concurrency.lockutils [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] Lock "511896d4-d9cb-42e0-b213-31be3cac191c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1696.394231] env[67977]: DEBUG oslo_concurrency.lockutils [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] Lock "511896d4-d9cb-42e0-b213-31be3cac191c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1696.394426] env[67977]: DEBUG nova.compute.manager [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] No waiting events found dispatching network-vif-plugged-811f8c79-f78c-4d4c-963b-0c6d6da418e3 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1696.394595] env[67977]: WARNING nova.compute.manager [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Received unexpected event network-vif-plugged-811f8c79-f78c-4d4c-963b-0c6d6da418e3 for instance with vm_state building and task_state spawning. [ 1696.394755] env[67977]: DEBUG nova.compute.manager [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Received event network-changed-811f8c79-f78c-4d4c-963b-0c6d6da418e3 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1696.394912] env[67977]: DEBUG nova.compute.manager [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Refreshing instance network info cache due to event network-changed-811f8c79-f78c-4d4c-963b-0c6d6da418e3. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1696.395112] env[67977]: DEBUG oslo_concurrency.lockutils [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] Acquiring lock "refresh_cache-511896d4-d9cb-42e0-b213-31be3cac191c" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1696.395253] env[67977]: DEBUG oslo_concurrency.lockutils [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] Acquired lock "refresh_cache-511896d4-d9cb-42e0-b213-31be3cac191c" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1696.395412] env[67977]: DEBUG nova.network.neutron [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Refreshing network info cache for port 811f8c79-f78c-4d4c-963b-0c6d6da418e3 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1696.630774] env[67977]: DEBUG nova.network.neutron [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Updated VIF entry in instance network info cache for port 811f8c79-f78c-4d4c-963b-0c6d6da418e3. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1696.631153] env[67977]: DEBUG nova.network.neutron [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Updating instance_info_cache with network_info: [{"id": "811f8c79-f78c-4d4c-963b-0c6d6da418e3", "address": "fa:16:3e:d3:c1:5f", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap811f8c79-f7", "ovs_interfaceid": "811f8c79-f78c-4d4c-963b-0c6d6da418e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1696.662407] env[67977]: DEBUG oslo_concurrency.lockutils [req-2b468737-f0aa-4148-88dc-0b6dd06fb71d req-60bd2a85-45c2-479c-906d-7778b98b1d2b service nova] Releasing lock "refresh_cache-511896d4-d9cb-42e0-b213-31be3cac191c" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1696.799984] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468257, 'name': CreateVM_Task, 'duration_secs': 0.271406} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1696.800144] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1696.801074] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1696.801287] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1696.801612] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1696.801855] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c83169b3-9100-4492-83b9-cc6709e718cb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.806036] env[67977]: DEBUG oslo_vmware.api [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 1696.806036] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d31424-87d0-4df6-582c-c704de222392" [ 1696.806036] env[67977]: _type = "Task" [ 1696.806036] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1696.813449] env[67977]: DEBUG oslo_vmware.api [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d31424-87d0-4df6-582c-c704de222392, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1697.316761] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1697.317060] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1697.317257] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1697.775296] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1698.774922] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1698.775283] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1698.775335] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1699.770650] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1699.775870] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1699.775870] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1699.786369] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1699.786684] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1699.786740] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1699.786899] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1699.788017] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe1d541a-8d76-4788-9d5e-f3ad17412dd9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1699.798102] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79d4145b-cd6c-45ba-aab3-c84d573c6bdf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1699.812052] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dfe3d8a-a6a3-47cb-8d94-46c988708f56 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1699.818354] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b5c5cd4-ca45-4a63-b637-53a02c965de8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1699.852593] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None 
None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180919MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1699.852813] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1699.853103] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1699.932564] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance f03fe248-75df-4237-a6dd-cc49012c2331 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1699.932727] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1699.932859] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e77a441b-952b-42c0-907f-e30888e505a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1699.932983] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1699.933119] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6fae5126-6618-4337-9a52-d6019727e0b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1699.933242] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b56ab7a8-cd27-4542-8082-ec023c57e153 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1699.933363] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1699.933479] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1699.933592] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d1fc2ae5-fa11-41a7-808b-13da16667078 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1699.933705] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 511896d4-d9cb-42e0-b213-31be3cac191c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1699.945267] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1699.945498] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1699.945650] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1700.086638] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-330a6233-a2be-454f-9ead-1f733e0aed98 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.094600] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3be164ca-f188-4eaa-a92e-66e15376c3e9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.125674] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e2a7e0e-49da-4e9a-aa28-3281b6b74e93 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.132610] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e65aba4b-d019-4373-a237-c0574d1093d3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.145264] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1700.153500] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1700.166546] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1700.166736] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.314s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1701.167205] env[67977]: DEBUG oslo_service.periodic_task [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1701.167542] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1705.593668] env[67977]: DEBUG oslo_concurrency.lockutils [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "d1fc2ae5-fa11-41a7-808b-13da16667078" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1705.776161] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1705.776339] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1705.776518] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1705.796733] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1705.796910] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1705.797053] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1705.797151] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1705.797328] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1705.797487] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1705.797555] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1705.797613] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1705.797734] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1705.797851] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1705.797966] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1743.189428] env[67977]: WARNING oslo_vmware.rw_handles [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1743.189428] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1743.189428] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1743.189428] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1743.189428] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1743.189428] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1743.189428] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1743.189428] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1743.189428] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1743.189428] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1743.189428] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1743.189428] env[67977]: ERROR oslo_vmware.rw_handles [ 1743.190039] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/adc7c54c-2d23-4925-bd8b-795c6349625a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data 
store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1743.192154] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1743.192530] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Copying Virtual Disk [datastore1] vmware_temp/adc7c54c-2d23-4925-bd8b-795c6349625a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/adc7c54c-2d23-4925-bd8b-795c6349625a/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1743.192857] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ee494059-88a3-4bb5-82c1-6e9c4c419a68 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1743.200904] env[67977]: DEBUG oslo_vmware.api [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for the task: (returnval){ [ 1743.200904] env[67977]: value = "task-3468258" [ 1743.200904] env[67977]: _type = "Task" [ 1743.200904] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1743.208770] env[67977]: DEBUG oslo_vmware.api [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Task: {'id': task-3468258, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1743.712204] env[67977]: DEBUG oslo_vmware.exceptions [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1743.712490] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1743.713044] env[67977]: ERROR nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1743.713044] env[67977]: Faults: ['InvalidArgument'] [ 1743.713044] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Traceback (most recent call last): [ 1743.713044] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1743.713044] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] yield resources [ 1743.713044] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1743.713044] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] self.driver.spawn(context, instance, image_meta, [ 1743.713044] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1743.713044] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1743.713044] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1743.713044] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] self._fetch_image_if_missing(context, vi) [ 1743.713044] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] image_cache(vi, tmp_image_ds_loc) [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] vm_util.copy_virtual_disk( [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] session._wait_for_task(vmdk_copy_task) [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] return self.wait_for_task(task_ref) [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] return evt.wait() [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] result = hub.switch() [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1743.713430] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] return self.greenlet.switch() [ 1743.713865] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1743.713865] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] self.f(*self.args, **self.kw) [ 1743.713865] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1743.713865] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] raise exceptions.translate_fault(task_info.error) [ 1743.713865] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1743.713865] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Faults: ['InvalidArgument'] [ 1743.713865] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] [ 1743.713865] env[67977]: INFO nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Terminating instance [ 1743.715033] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1743.715250] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1743.715486] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8a0e1dde-650f-4211-bea6-5878b210aab5 {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1743.717627] env[67977]: DEBUG nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1743.717820] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1743.718569] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81d56d56-0690-4e64-99ac-2f313d2a235f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1743.725195] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1743.725430] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9fa9d7e3-ce92-4989-a921-8edbcf68f2b6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1743.727520] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1743.727688] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1743.728616] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f8d49c02-0b2f-40f7-bc67-b2719c104491 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1743.733602] env[67977]: DEBUG oslo_vmware.api [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 1743.733602] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5207b96d-1a37-3637-0dd9-4002ea058e96" [ 1743.733602] env[67977]: _type = "Task" [ 1743.733602] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1743.740751] env[67977]: DEBUG oslo_vmware.api [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5207b96d-1a37-3637-0dd9-4002ea058e96, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1743.798222] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1743.798440] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1743.798624] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Deleting the datastore file [datastore1] f03fe248-75df-4237-a6dd-cc49012c2331 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1743.798873] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-911bca5e-3aca-4291-af3d-8cadd71aa9a2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1743.805117] env[67977]: DEBUG oslo_vmware.api [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for the task: (returnval){ [ 1743.805117] env[67977]: value = "task-3468260" [ 1743.805117] env[67977]: _type = "Task" [ 1743.805117] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1743.812482] env[67977]: DEBUG oslo_vmware.api [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Task: {'id': task-3468260, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1744.243596] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1744.243963] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating directory with path [datastore1] vmware_temp/49bffb65-8392-432b-bd4a-33627250f55c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1744.244114] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d8d2d3fb-91a2-4ad9-8e7c-4e77c8975530 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.255738] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Created directory with path [datastore1] vmware_temp/49bffb65-8392-432b-bd4a-33627250f55c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1744.255988] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Fetch image to [datastore1] vmware_temp/49bffb65-8392-432b-bd4a-33627250f55c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1744.256132] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/49bffb65-8392-432b-bd4a-33627250f55c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1744.256893] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0e51af3-8717-44d1-9837-20b51bac7af5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.263575] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa0bb67e-2a32-4afc-b8ec-4ff2757304eb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.272863] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bda6a38-f4d5-4883-906c-eba070bdcbf9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.303647] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2f62cbd-b514-46d0-9d40-e4c8eacb4f1f {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.315173] env[67977]: DEBUG oslo_vmware.api [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Task: {'id': task-3468260, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075647} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1744.315730] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1744.315942] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1744.316137] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1744.316320] env[67977]: INFO nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Took 0.60 seconds to destroy the instance on the hypervisor. 
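The CopyVirtualDisk_Task failure above follows the standard oslo.vmware pattern: invoke_api() starts a server-side task, wait_for_task() polls it (the _poll_task progress lines), and a failed task is re-raised client-side as VimFaultException, here with Faults: ['InvalidArgument'] on fileType. A minimal sketch of that call pattern, assuming hypothetical vCenter credentials and datastore paths rather than the exact nova code path:

    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vmware_exc

    # Hypothetical endpoint and credentials, for illustration only.
    session = vmware_api.VMwareAPISession(
        'vcenter.example.org', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)
    disk_mgr = session.vim.service_content.virtualDiskManager
    try:
        # Start the server-side copy task; nova.virt.vmwareapi.vm_util
        # drives the same CopyVirtualDisk_Task seen in the traceback above.
        task = session.invoke_api(
            session.vim, 'CopyVirtualDisk_Task', disk_mgr,
            sourceName='[datastore1] vmware_temp/example/tmp-sparse.vmdk',
            destName='[datastore1] vmware_temp/example/example.vmdk')
        # Blocks while the task runs; a task error is translated into a
        # client-side exception, as in the spawn failure logged above.
        session.wait_for_task(task)
    except vmware_exc.VimFaultException as e:
        print(e.fault_list, str(e))  # e.g. ['InvalidArgument'] ...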
[ 1744.317851] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fd478932-0083-48c0-84d4-13812c1fa4a3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.319929] env[67977]: DEBUG nova.compute.claims [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1744.320117] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1744.320337] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1744.343063] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1744.399910] env[67977]: DEBUG oslo_vmware.rw_handles [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/49bffb65-8392-432b-bd4a-33627250f55c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1744.459607] env[67977]: DEBUG oslo_vmware.rw_handles [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1744.459853] env[67977]: DEBUG oslo_vmware.rw_handles [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/49bffb65-8392-432b-bd4a-33627250f55c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1744.560390] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-330fdfc5-82fe-47a3-aae4-c1452bfaad64 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.568265] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17da5e7f-c03c-464d-bf32-a3e65f9f1cb3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.597887] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28b8f00f-8159-4f4a-853e-eee3a55207ff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.604705] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37580c51-8985-4358-a963-01511227e0a5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.617486] env[67977]: DEBUG nova.compute.provider_tree [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1744.625604] env[67977]: DEBUG nova.scheduler.client.report [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1744.640609] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.320s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1744.641161] env[67977]: ERROR nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1744.641161] env[67977]: Faults: ['InvalidArgument'] [ 1744.641161] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Traceback (most recent call last): [ 1744.641161] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1744.641161] env[67977]: ERROR 
nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] self.driver.spawn(context, instance, image_meta, [ 1744.641161] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1744.641161] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1744.641161] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1744.641161] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] self._fetch_image_if_missing(context, vi) [ 1744.641161] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1744.641161] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] image_cache(vi, tmp_image_ds_loc) [ 1744.641161] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] vm_util.copy_virtual_disk( [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] session._wait_for_task(vmdk_copy_task) [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] return self.wait_for_task(task_ref) [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] return evt.wait() [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] result = hub.switch() [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] return self.greenlet.switch() [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1744.641482] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] self.f(*self.args, **self.kw) [ 1744.641788] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1744.641788] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] raise exceptions.translate_fault(task_info.error) [ 1744.641788] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1744.641788] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Faults: ['InvalidArgument'] [ 1744.641788] env[67977]: ERROR nova.compute.manager [instance: f03fe248-75df-4237-a6dd-cc49012c2331] [ 1744.641935] env[67977]: DEBUG nova.compute.utils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1744.643180] env[67977]: DEBUG nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Build of instance f03fe248-75df-4237-a6dd-cc49012c2331 was re-scheduled: A specified parameter was not correct: fileType [ 1744.643180] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1744.643572] env[67977]: DEBUG nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1744.643750] env[67977]: DEBUG nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1744.643984] env[67977]: DEBUG nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1744.644200] env[67977]: DEBUG nova.network.neutron [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1745.101056] env[67977]: DEBUG nova.network.neutron [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1745.113145] env[67977]: INFO nova.compute.manager [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Took 0.47 seconds to deallocate network for instance. [ 1745.211420] env[67977]: INFO nova.scheduler.client.report [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Deleted allocations for instance f03fe248-75df-4237-a6dd-cc49012c2331 [ 1745.232406] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d8c85d07-23ab-43b7-9314-3260828fb356 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "f03fe248-75df-4237-a6dd-cc49012c2331" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 629.235s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1745.233023] env[67977]: DEBUG oslo_concurrency.lockutils [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "f03fe248-75df-4237-a6dd-cc49012c2331" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 433.465s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1745.233287] env[67977]: DEBUG oslo_concurrency.lockutils [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "f03fe248-75df-4237-a6dd-cc49012c2331-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1745.234067] env[67977]: DEBUG oslo_concurrency.lockutils [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "f03fe248-75df-4237-a6dd-cc49012c2331-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1745.234161] env[67977]: DEBUG oslo_concurrency.lockutils [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "f03fe248-75df-4237-a6dd-cc49012c2331-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1745.236200] env[67977]: INFO nova.compute.manager [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Terminating instance [ 1745.237897] env[67977]: DEBUG nova.compute.manager [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1745.238121] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1745.238623] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0a322e31-d66a-41d4-9505-26ff94546cd4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.243081] env[67977]: DEBUG nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1745.252916] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a98828f9-d70a-4c4a-b36a-96a63bd5321b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.282823] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f03fe248-75df-4237-a6dd-cc49012c2331 could not be found. [ 1745.283089] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1745.283280] env[67977]: INFO nova.compute.manager [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Took 0.05 seconds to destroy the instance on the hypervisor. 
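The 433.465s wait above is lock serialization at work: _locked_do_build_and_run_instance and do_terminate_instance synchronize on the same instance-UUID lock, so the terminate request queued until the failed build finally released it. A minimal sketch of the oslo.concurrency pattern that emits these "acquired by ... :: waited/held" lines (hypothetical function body):

    from oslo_concurrency import lockutils

    # Both code paths wrap their work in the same named lock; the DEBUG
    # acquire/release lines above are logged by lockutils' inner() wrapper.
    @lockutils.synchronized('f03fe248-75df-4237-a6dd-cc49012c2331')
    def do_terminate_instance():
        pass  # runs only after the build path releases the lock

    do_terminate_instance()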
[ 1745.283530] env[67977]: DEBUG oslo.service.loopingcall [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1745.286091] env[67977]: DEBUG nova.compute.manager [-] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1745.286285] env[67977]: DEBUG nova.network.neutron [-] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1745.300154] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1745.300401] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1745.301824] env[67977]: INFO nova.compute.claims [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1745.312717] env[67977]: DEBUG nova.network.neutron [-] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1745.321220] env[67977]: INFO nova.compute.manager [-] [instance: f03fe248-75df-4237-a6dd-cc49012c2331] Took 0.03 seconds to deallocate network for instance. 
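The inventory dict logged repeatedly above turns into schedulable capacity as (total - reserved) * allocation_ratio, which is why the claim for 157e3bfe-10cc-49c6-aa31-1d935e1a4465 succeeds easily: 48 physical vCPUs present as 192 schedulable ones. A self-contained check using the values copied from the log:

    # capacity = (total - reserved) * allocation_ratio, per resource class
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        print(rc, (inv['total'] - inv['reserved']) * inv['allocation_ratio'])
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0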
[ 1745.411508] env[67977]: DEBUG oslo_concurrency.lockutils [None req-5219e7fe-7611-41ba-bc19-1012165daee6 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "f03fe248-75df-4237-a6dd-cc49012c2331" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.178s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1745.480116] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66113044-c2c0-4acb-9c93-bd389cc71d43 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.487491] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0d4fe95-7875-4130-8405-7d3eb551087a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.518992] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d10cc0ed-94e5-445d-8b40-2b08fcf9acd1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.526120] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e96a9ba-bc8a-4ae7-9c75-76381ee7d625 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.539046] env[67977]: DEBUG nova.compute.provider_tree [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1745.547994] env[67977]: DEBUG nova.scheduler.client.report [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1745.563262] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1745.563651] env[67977]: DEBUG nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Start building networks asynchronously for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1745.596882] env[67977]: DEBUG nova.compute.utils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1745.599033] env[67977]: DEBUG nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1745.599033] env[67977]: DEBUG nova.network.neutron [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1745.609890] env[67977]: DEBUG nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1745.672210] env[67977]: DEBUG nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1745.693791] env[67977]: DEBUG nova.policy [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd35039d87f274119a281d2836618862b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '629b2265a2eb45128d27cb16a9e0304b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1745.697472] env[67977]: DEBUG nova.virt.hardware [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:563}} [ 1745.697721] env[67977]: DEBUG nova.virt.hardware [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1745.697883] env[67977]: DEBUG nova.virt.hardware [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1745.698067] env[67977]: DEBUG nova.virt.hardware [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1745.698215] env[67977]: DEBUG nova.virt.hardware [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1745.698424] env[67977]: DEBUG nova.virt.hardware [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1745.698844] env[67977]: DEBUG nova.virt.hardware [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1745.698844] env[67977]: DEBUG nova.virt.hardware [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1745.698996] env[67977]: DEBUG nova.virt.hardware [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1745.699099] env[67977]: DEBUG nova.virt.hardware [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1745.699277] env[67977]: DEBUG nova.virt.hardware [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1745.700155] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ea0da2c-8a33-429d-899c-27cefd245ab4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.708032] env[67977]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ada20fd6-6fe2-42d4-b83d-283de4108aa9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.982059] env[67977]: DEBUG nova.network.neutron [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Successfully created port: f0d99bff-f65c-4cee-b3b8-288d6f788da2 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1746.664333] env[67977]: DEBUG nova.network.neutron [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Successfully updated port: f0d99bff-f65c-4cee-b3b8-288d6f788da2 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1746.678673] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "refresh_cache-157e3bfe-10cc-49c6-aa31-1d935e1a4465" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1746.678898] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired lock "refresh_cache-157e3bfe-10cc-49c6-aa31-1d935e1a4465" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1746.678983] env[67977]: DEBUG nova.network.neutron [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1746.741593] env[67977]: DEBUG nova.network.neutron [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Instance cache missing network info. 
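The run of hardware.py entries above records the whole CPU-topology search for this boot: flavor m1.nano requests one vCPU, neither flavor nor image constrains sockets/cores/threads (the limits default to 65536 apiece), and the only factorization of 1 vCPU is 1:1:1. A minimal sketch of that enumeration, as a plain Python illustration rather than Nova's actual _get_possible_cpu_topologies:

    from collections import namedtuple

    VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # enumerate every (sockets, cores, threads) whose product is the
        # requested vCPU count and which stays inside the per-axis limits
        found = []
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        found.append(VirtCPUTopology(s, c, t))
        return found

    print(possible_topologies(1))
    # [VirtCPUTopology(sockets=1, cores=1, threads=1)] -- matching the
    # "Got 1 possible topologies" entry above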
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1746.944323] env[67977]: DEBUG nova.network.neutron [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Updating instance_info_cache with network_info: [{"id": "f0d99bff-f65c-4cee-b3b8-288d6f788da2", "address": "fa:16:3e:aa:7a:04", "network": {"id": "9f9704ef-f97e-4049-b46f-6c90efa33e6e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-765087939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "629b2265a2eb45128d27cb16a9e0304b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf0d99bff-f6", "ovs_interfaceid": "f0d99bff-f65c-4cee-b3b8-288d6f788da2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1746.957171] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Releasing lock "refresh_cache-157e3bfe-10cc-49c6-aa31-1d935e1a4465" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1746.957595] env[67977]: DEBUG nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Instance network_info: |[{"id": "f0d99bff-f65c-4cee-b3b8-288d6f788da2", "address": "fa:16:3e:aa:7a:04", "network": {"id": "9f9704ef-f97e-4049-b46f-6c90efa33e6e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-765087939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "629b2265a2eb45128d27cb16a9e0304b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf0d99bff-f6", "ovs_interfaceid": "f0d99bff-f65c-4cee-b3b8-288d6f788da2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1746.958037] env[67977]: DEBUG 
nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:aa:7a:04', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '89ef02af-c508-432f-ae29-3a219701d584', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f0d99bff-f65c-4cee-b3b8-288d6f788da2', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1746.965785] env[67977]: DEBUG oslo.service.loopingcall [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1746.966258] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1746.966489] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fc7501e6-ac4c-4f96-8acb-7323910ac54e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.987115] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1746.987115] env[67977]: value = "task-3468261" [ 1746.987115] env[67977]: _type = "Task" [ 1746.987115] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1746.994645] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468261, 'name': CreateVM_Task} progress is 0%. 
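Task-3468261 above is not awaited synchronously: oslo.vmware schedules a looping call that re-reads the task's state until it reaches a terminal value, logging the progress percentage on each pass. A simplified stand-in for that loop (poll_fn is a hypothetical callable returning a plain dict, not oslo.vmware's real TaskInfo object):

    import time

    def wait_for_task(poll_fn, interval=0.5):
        # poll until the task leaves its queued/running states
        while True:
            info = poll_fn()
            if info["state"] == "success":
                return info
            if info["state"] == "error":
                raise RuntimeError(info.get("error", "task failed"))
            print("progress is %s%%" % info.get("progress", 0))
            time.sleep(interval)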
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1747.144051] env[67977]: DEBUG nova.compute.manager [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Received event network-vif-plugged-f0d99bff-f65c-4cee-b3b8-288d6f788da2 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1747.144296] env[67977]: DEBUG oslo_concurrency.lockutils [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] Acquiring lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1747.144539] env[67977]: DEBUG oslo_concurrency.lockutils [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] Lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1747.144729] env[67977]: DEBUG oslo_concurrency.lockutils [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] Lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1747.144914] env[67977]: DEBUG nova.compute.manager [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] No waiting events found dispatching network-vif-plugged-f0d99bff-f65c-4cee-b3b8-288d6f788da2 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1747.145100] env[67977]: WARNING nova.compute.manager [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Received unexpected event network-vif-plugged-f0d99bff-f65c-4cee-b3b8-288d6f788da2 for instance with vm_state building and task_state spawning. [ 1747.145305] env[67977]: DEBUG nova.compute.manager [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Received event network-changed-f0d99bff-f65c-4cee-b3b8-288d6f788da2 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1747.145513] env[67977]: DEBUG nova.compute.manager [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Refreshing instance network info cache due to event network-changed-f0d99bff-f65c-4cee-b3b8-288d6f788da2. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1747.145736] env[67977]: DEBUG oslo_concurrency.lockutils [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] Acquiring lock "refresh_cache-157e3bfe-10cc-49c6-aa31-1d935e1a4465" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1747.145903] env[67977]: DEBUG oslo_concurrency.lockutils [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] Acquired lock "refresh_cache-157e3bfe-10cc-49c6-aa31-1d935e1a4465" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1747.146109] env[67977]: DEBUG nova.network.neutron [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Refreshing network info cache for port f0d99bff-f65c-4cee-b3b8-288d6f788da2 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1747.379215] env[67977]: DEBUG nova.network.neutron [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Updated VIF entry in instance network info cache for port f0d99bff-f65c-4cee-b3b8-288d6f788da2. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1747.379578] env[67977]: DEBUG nova.network.neutron [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Updating instance_info_cache with network_info: [{"id": "f0d99bff-f65c-4cee-b3b8-288d6f788da2", "address": "fa:16:3e:aa:7a:04", "network": {"id": "9f9704ef-f97e-4049-b46f-6c90efa33e6e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-765087939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "629b2265a2eb45128d27cb16a9e0304b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf0d99bff-f6", "ovs_interfaceid": "f0d99bff-f65c-4cee-b3b8-288d6f788da2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1747.389071] env[67977]: DEBUG oslo_concurrency.lockutils [req-48108a26-2056-46ee-8753-162b48cd212b req-f1e2eae4-74b4-439c-9517-acd1ea506648 service nova] Releasing lock "refresh_cache-157e3bfe-10cc-49c6-aa31-1d935e1a4465" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1747.497597] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468261, 'name': CreateVM_Task, 'duration_secs': 0.267792} completed successfully. 
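The instance_info_cache payload logged twice above is a JSON list with one element per VIF. For reference, a small reader for the fields most often pulled out of it; network_info stands for the list shown in the log entry, and the expected values are taken from it:

    def summarize_vif(vif):
        # walk the nested structure: network -> subnets -> ips
        ips = [ip["address"]
               for subnet in vif["network"]["subnets"]
               for ip in subnet["ips"]]
        return {
            "port_id": vif["id"],          # f0d99bff-f65c-4cee-b3b8-288d6f788da2
            "mac": vif["address"],         # fa:16:3e:aa:7a:04
            "fixed_ips": ips,              # ['192.168.128.12']
            "mtu": vif["network"]["meta"]["mtu"],  # 8950
        }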
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1747.497779] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1747.498456] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1747.498636] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1747.498941] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1747.499197] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-70b5c796-8feb-4d42-8566-42da05170e46 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.503438] env[67977]: DEBUG oslo_vmware.api [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){ [ 1747.503438] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523f04d2-fe99-54f9-a344-8c4059cd1d9a" [ 1747.503438] env[67977]: _type = "Task" [ 1747.503438] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1747.510802] env[67977]: DEBUG oslo_vmware.api [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]523f04d2-fe99-54f9-a344-8c4059cd1d9a, 'name': SearchDatastore_Task} progress is 0%. 
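The lock acquisitions above serialize access to the cached image: the lock name is simply the image's datastore path, so every request touching [datastore1] devstack-image-cache_base/5ac2bac3-... queues on the same oslo.concurrency named lock. The pattern, sketched with an illustrative wrapper (lockutils.lock is the real API; the helper around it is hypothetical):

    from oslo_concurrency import lockutils

    def with_image_cache_lock(datastore, image_id, fn):
        # one named lock per cached image; all callers serialize on the path
        lock_name = "[%s] devstack-image-cache_base/%s" % (datastore, image_id)
        with lockutils.lock(lock_name):
            return fn()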
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1748.014055] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1748.014402] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1748.014402] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1754.249105] env[67977]: DEBUG oslo_concurrency.lockutils [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "511896d4-d9cb-42e0-b213-31be3cac191c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1758.431692] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1758.432035] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1758.774631] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1758.774939] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1759.771369] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1759.771659] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1760.775127] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1760.775338] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1761.776018] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1761.776372] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1761.776372] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1761.776530] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1761.787964] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1761.788195] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1761.788363] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1761.788520] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1761.789666] env[67977]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b901383-c2eb-48d3-bbca-d2a47a01b919 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.798312] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd76ad0c-fe57-4e81-a0e0-c05864de15be {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.812926] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae73f3ca-1e7f-4800-a279-9bee021f9b1d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.818861] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-207c728b-2f63-4d2b-8b73-3a0619928e45 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.848579] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180933MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1761.848735] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1761.848927] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1761.920301] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1761.920468] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e77a441b-952b-42c0-907f-e30888e505a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1761.920591] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1761.920710] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6fae5126-6618-4337-9a52-d6019727e0b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1761.920829] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b56ab7a8-cd27-4542-8082-ec023c57e153 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1761.920948] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1761.921084] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1761.921198] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d1fc2ae5-fa11-41a7-808b-13da16667078 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1761.921311] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 511896d4-d9cb-42e0-b213-31be3cac191c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1761.921423] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1761.954091] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
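The per-instance allocations reviewed above and the totals reported in the next few entries are mutually consistent: each of the ten actively managed instances holds {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, and the inventory reserves 512 MB of host memory. Assuming the tracker folds that reservation into used_ram, the arithmetic checks out:

    instances = 10
    used_ram_mb = instances * 128 + 512   # 1792, matches used_ram=1792MB below
    used_disk_gb = instances * 1          # 10,   matches used_disk=10GB
    used_vcpus = instances * 1            # 10,   matches used_vcpus=10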
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1761.954361] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1761.954476] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1762.092166] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21b633c5-8bc4-4dc0-ad23-a73ee468f964 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1762.100315] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-214f0c16-f2f6-41b4-9376-b2cd09411e3b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1762.130030] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96a87e19-5e47-4ed6-b432-5d50d2d2a731 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1762.136491] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f25c112c-cda8-42da-8762-6296a517ce71 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1762.148937] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1762.157508] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1762.173916] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1762.174117] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.325s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1767.174569] env[67977]: DEBUG oslo_service.periodic_task [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1767.174995] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1767.174995] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1767.202241] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1767.202437] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1767.202573] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1767.202702] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1767.202826] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1767.202949] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1767.203084] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1767.203207] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1767.203326] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1767.203449] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1767.203614] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1774.461908] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1774.461908] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1788.062828] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1788.063175] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1790.209125] env[67977]: WARNING oslo_vmware.rw_handles [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1790.209125] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1790.209125] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1790.209125] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1790.209125] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1790.209125] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1790.209125] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1790.209125] env[67977]: ERROR 
oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1790.209125] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1790.209125] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1790.209125] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1790.209125] env[67977]: ERROR oslo_vmware.rw_handles [ 1790.209770] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/49bffb65-8392-432b-bd4a-33627250f55c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1790.211491] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1790.211764] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Copying Virtual Disk [datastore1] vmware_temp/49bffb65-8392-432b-bd4a-33627250f55c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/49bffb65-8392-432b-bd4a-33627250f55c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1790.212065] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e42b5d8c-4264-4c27-b3d3-232e62677708 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.220533] env[67977]: DEBUG oslo_vmware.api [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 1790.220533] env[67977]: value = "task-3468262" [ 1790.220533] env[67977]: _type = "Task" [ 1790.220533] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1790.228494] env[67977]: DEBUG oslo_vmware.api [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468262, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1790.730776] env[67977]: DEBUG oslo_vmware.exceptions [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Fault InvalidArgument not matched. 
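get_fault_class above found no dedicated exception class for 'InvalidArgument', so the failed CopyVirtualDisk_Task surfaces as the generic VimFaultException seen in the traceback that follows, carrying the fault names in its fault_list. A sketch of catching it at the call site (the function and its arguments are illustrative, not the driver's code):

    from oslo_vmware import exceptions as vexc

    def copy_with_fault_check(session, copy_task):
        try:
            return session.wait_for_task(copy_task)
        except vexc.VimFaultException as exc:
            if "InvalidArgument" in exc.fault_list:
                # the "fileType" complaint below would land here
                print("copy failed with InvalidArgument: %s" % exc)
            raise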
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1790.731056] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1790.731608] env[67977]: ERROR nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1790.731608] env[67977]: Faults: ['InvalidArgument'] [ 1790.731608] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Traceback (most recent call last): [ 1790.731608] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1790.731608] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] yield resources [ 1790.731608] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1790.731608] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] self.driver.spawn(context, instance, image_meta, [ 1790.731608] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1790.731608] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1790.731608] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1790.731608] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] self._fetch_image_if_missing(context, vi) [ 1790.731608] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] image_cache(vi, tmp_image_ds_loc) [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] vm_util.copy_virtual_disk( [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] session._wait_for_task(vmdk_copy_task) [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] return self.wait_for_task(task_ref) [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] return evt.wait() [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] result = hub.switch() [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1790.732653] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] return self.greenlet.switch() [ 1790.733304] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1790.733304] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] self.f(*self.args, **self.kw) [ 1790.733304] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1790.733304] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] raise exceptions.translate_fault(task_info.error) [ 1790.733304] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1790.733304] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Faults: ['InvalidArgument'] [ 1790.733304] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] [ 1790.733304] env[67977]: INFO nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Terminating instance [ 1790.733632] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1790.733668] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1790.733908] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c4208c3f-b81a-49c4-abab-8639de96e37c {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.736027] env[67977]: DEBUG nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1790.736230] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1790.736932] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a1a8c02-59b4-4f3e-a20b-781378e0d7d1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.745268] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1790.746294] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-baaf3570-9102-4782-851e-c47d5c8513fc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.747560] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1790.747733] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1790.748421] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2504b580-1d94-4a92-a43a-2a38ee87cced {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.753984] env[67977]: DEBUG oslo_vmware.api [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){ [ 1790.753984] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]525ba4d0-aef5-e42d-62f4-1a65a2b520ac" [ 1790.753984] env[67977]: _type = "Task" [ 1790.753984] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1790.760894] env[67977]: DEBUG oslo_vmware.api [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]525ba4d0-aef5-e42d-62f4-1a65a2b520ac, 'name': SearchDatastore_Task} progress is 0%. 
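The mkdir entries above ("Creating directory ... Created directory ... Folder ... created.") implement an idempotent create-if-missing: MakeDirectory is attempted unconditionally and a pre-existing folder counts as success. Sketched with a hypothetical make_directory stand-in for the FileManager.MakeDirectory invocation:

    from oslo_vmware import exceptions as vexc

    def create_folder_if_missing(make_directory, ds_path):
        try:
            make_directory(ds_path)
        except vexc.FileAlreadyExistsException:
            pass  # another worker created it first; same end state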
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1790.829189] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1790.829465] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1790.829619] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleting the datastore file [datastore1] 5edda5cc-6295-4abe-a21e-0cf684063cb3 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1790.829884] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6de1f4f6-fd7c-4154-b3e9-412bde0e74d5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.835806] env[67977]: DEBUG oslo_vmware.api [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 1790.835806] env[67977]: value = "task-3468264" [ 1790.835806] env[67977]: _type = "Task" [ 1790.835806] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1790.843352] env[67977]: DEBUG oslo_vmware.api [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468264, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1791.264116] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1791.264431] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Creating directory with path [datastore1] vmware_temp/d2fb53f2-2d34-4ea8-9a46-d068ecb6cbf5/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1791.264616] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5aead1ee-e26b-417a-9b03-6d6f4070ae76 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1791.275807] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Created directory with path [datastore1] vmware_temp/d2fb53f2-2d34-4ea8-9a46-d068ecb6cbf5/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1791.275999] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Fetch image to [datastore1] vmware_temp/d2fb53f2-2d34-4ea8-9a46-d068ecb6cbf5/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1791.276186] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/d2fb53f2-2d34-4ea8-9a46-d068ecb6cbf5/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1791.276930] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e08d397-8c70-47c1-84cb-a85ab22b4710 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1791.283233] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc01df34-59af-465b-b09d-2fb238a9ac97 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1791.291819] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52e860e6-d2e2-4b42-9c41-d47230b7fd61 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1791.320960] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0063462-f85a-460f-9abb-f94357d95708 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1791.325955] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c67838e4-2eac-453f-b695-1e919966c456 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1791.343931] env[67977]: DEBUG oslo_vmware.api [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468264, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074808} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1791.344171] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1791.344347] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1791.344519] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1791.344687] env[67977]: INFO nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1791.346748] env[67977]: DEBUG nova.compute.claims [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1791.346946] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1791.347182] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1791.354065] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1791.408272] env[67977]: DEBUG oslo_vmware.rw_handles [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d2fb53f2-2d34-4ea8-9a46-d068ecb6cbf5/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1791.470227] env[67977]: DEBUG oslo_vmware.rw_handles [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1791.470461] env[67977]: DEBUG oslo_vmware.rw_handles [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d2fb53f2-2d34-4ea8-9a46-d068ecb6cbf5/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1791.585243] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dfa237c-607f-4d46-b471-d43830f9fcb1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1791.594168] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9f87ea9-456c-475b-9436-47066dc325ea {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1791.622930] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f24bbfc-0c58-4007-817a-4cfaca32d063 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1791.629870] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c3d4673-300c-43ce-b1a2-7ce43381d686 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1791.642426] env[67977]: DEBUG nova.compute.provider_tree [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1791.650589] env[67977]: DEBUG nova.scheduler.client.report [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1791.663537] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.316s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1791.664049] env[67977]: ERROR nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1791.664049] env[67977]: Faults: ['InvalidArgument']
[ 1791.664049] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Traceback (most recent call last):
[ 1791.664049] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1791.664049] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] self.driver.spawn(context, instance, image_meta,
[ 1791.664049] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1791.664049] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1791.664049] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1791.664049] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] self._fetch_image_if_missing(context, vi)
[ 1791.664049] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1791.664049] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] image_cache(vi, tmp_image_ds_loc)
[ 1791.664049] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] vm_util.copy_virtual_disk(
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] session._wait_for_task(vmdk_copy_task)
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] return self.wait_for_task(task_ref)
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] return evt.wait()
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] result = hub.switch()
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] return self.greenlet.switch()
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1791.664424] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] self.f(*self.args, **self.kw)
[ 1791.664834] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1791.664834] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] raise exceptions.translate_fault(task_info.error)
[ 1791.664834] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1791.664834] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Faults: ['InvalidArgument']
[ 1791.664834] env[67977]: ERROR nova.compute.manager [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3]
[ 1791.664834] env[67977]: DEBUG nova.compute.utils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1791.666016] env[67977]: DEBUG nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Build of instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 was re-scheduled: A specified parameter was not correct: fileType
[ 1791.666016] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1791.666411] env[67977]: DEBUG nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1791.666586] env[67977]: DEBUG nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1791.666757] env[67977]: DEBUG nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1791.666919] env[67977]: DEBUG nova.network.neutron [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1791.913877] env[67977]: DEBUG nova.network.neutron [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1791.928822] env[67977]: INFO nova.compute.manager [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Took 0.26 seconds to deallocate network for instance.
[ 1792.026676] env[67977]: INFO nova.scheduler.client.report [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleted allocations for instance 5edda5cc-6295-4abe-a21e-0cf684063cb3
[ 1792.050739] env[67977]: DEBUG oslo_concurrency.lockutils [None req-d4ff182a-aed9-44d2-9fdc-a7a18b1e9e81 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "5edda5cc-6295-4abe-a21e-0cf684063cb3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 628.758s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1792.052152] env[67977]: DEBUG oslo_concurrency.lockutils [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "5edda5cc-6295-4abe-a21e-0cf684063cb3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 431.901s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1792.052268] env[67977]: DEBUG oslo_concurrency.lockutils [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "5edda5cc-6295-4abe-a21e-0cf684063cb3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1792.052887] env[67977]: DEBUG oslo_concurrency.lockutils [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "5edda5cc-6295-4abe-a21e-0cf684063cb3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1792.052887] env[67977]: DEBUG oslo_concurrency.lockutils [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "5edda5cc-6295-4abe-a21e-0cf684063cb3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1792.055869] env[67977]: INFO nova.compute.manager [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Terminating instance
[ 1792.058251] env[67977]: DEBUG nova.compute.manager [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1792.058581] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1792.058902] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e3344e80-8e7d-43e9-96c7-d6f75052d5cd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1792.068072] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-322cec1d-0042-42a1-8f33-2ad5be227f8f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1792.080128] env[67977]: DEBUG nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1792.100306] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5edda5cc-6295-4abe-a21e-0cf684063cb3 could not be found.
[ 1792.100517] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1792.100701] env[67977]: INFO nova.compute.manager [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1792.100944] env[67977]: DEBUG oslo.service.loopingcall [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1792.103311] env[67977]: DEBUG nova.compute.manager [-] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1792.103390] env[67977]: DEBUG nova.network.neutron [-] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1792.125779] env[67977]: DEBUG nova.network.neutron [-] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1792.132531] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1792.132769] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1792.134217] env[67977]: INFO nova.compute.claims [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1792.137283] env[67977]: INFO nova.compute.manager [-] [instance: 5edda5cc-6295-4abe-a21e-0cf684063cb3] Took 0.03 seconds to deallocate network for instance.
[ 1792.222054] env[67977]: DEBUG oslo_concurrency.lockutils [None req-16ee91af-7415-4a76-9291-3d262ac907bc tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "5edda5cc-6295-4abe-a21e-0cf684063cb3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.170s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1792.323760] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79ebca2f-825a-4d7f-964c-39e6406a687b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1792.331508] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2840a2b1-fa83-40a9-a2ec-942c2a4d9e39 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1792.360052] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12b2375e-13a3-4604-a0f5-3a550371cb69 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1792.366328] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d468a46-d5e2-44b1-a125-9b0af69b8f91 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1792.380395] env[67977]: DEBUG nova.compute.provider_tree [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1792.388434] env[67977]: DEBUG nova.scheduler.client.report [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1792.400890] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1792.401347] env[67977]: DEBUG nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1792.434776] env[67977]: DEBUG nova.compute.utils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1792.436175] env[67977]: DEBUG nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1792.436376] env[67977]: DEBUG nova.network.neutron [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1792.444444] env[67977]: DEBUG nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1792.499800] env[67977]: DEBUG nova.policy [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd76b3cc7fe2143dabe6ab02906a25097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '52e6b27298274fa1a10d95d9a967814b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1792.505578] env[67977]: DEBUG nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1792.530996] env[67977]: DEBUG nova.virt.hardware [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1792.531253] env[67977]: DEBUG nova.virt.hardware [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1792.531415] env[67977]: DEBUG nova.virt.hardware [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1792.531600] env[67977]: DEBUG nova.virt.hardware [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1792.531748] env[67977]: DEBUG nova.virt.hardware [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1792.531898] env[67977]: DEBUG nova.virt.hardware [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1792.532138] env[67977]: DEBUG nova.virt.hardware [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1792.532300] env[67977]: DEBUG nova.virt.hardware [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1792.532466] env[67977]: DEBUG nova.virt.hardware [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1792.532628] env[67977]: DEBUG nova.virt.hardware [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1792.532798] env[67977]: DEBUG nova.virt.hardware [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1792.533657] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2571efdc-c992-4a0f-afde-6296a9151696 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1792.541815] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5cb51ea-1e86-4200-9f2b-b905258efe2e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1792.845644] env[67977]: DEBUG nova.network.neutron [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Successfully created port: a739420a-11a5-4060-8a5c-dc9f2c903298 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1793.683544] env[67977]: DEBUG nova.network.neutron [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Successfully updated port: a739420a-11a5-4060-8a5c-dc9f2c903298 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1793.701100] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "refresh_cache-2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1793.701271] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired lock "refresh_cache-2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1793.701428] env[67977]: DEBUG nova.network.neutron [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1793.737890] env[67977]: DEBUG nova.network.neutron [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1793.892463] env[67977]: DEBUG nova.network.neutron [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Updating instance_info_cache with network_info: [{"id": "a739420a-11a5-4060-8a5c-dc9f2c903298", "address": "fa:16:3e:a1:88:95", "network": {"id": "4eece86d-f584-415a-bcdd-5bde739283be", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1877932954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "52e6b27298274fa1a10d95d9a967814b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "975b168a-03e5-449d-95ac-4d51ba027242", "external-id": "nsx-vlan-transportzone-365", "segmentation_id": 365, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa739420a-11", "ovs_interfaceid": "a739420a-11a5-4060-8a5c-dc9f2c903298", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1793.904378] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Releasing lock "refresh_cache-2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1793.904699] env[67977]: DEBUG nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Instance network_info: |[{"id": "a739420a-11a5-4060-8a5c-dc9f2c903298", "address": "fa:16:3e:a1:88:95", "network": {"id": "4eece86d-f584-415a-bcdd-5bde739283be", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1877932954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "52e6b27298274fa1a10d95d9a967814b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "975b168a-03e5-449d-95ac-4d51ba027242", "external-id": "nsx-vlan-transportzone-365", "segmentation_id": 365, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa739420a-11", "ovs_interfaceid": "a739420a-11a5-4060-8a5c-dc9f2c903298", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1793.905128] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a1:88:95', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '975b168a-03e5-449d-95ac-4d51ba027242', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a739420a-11a5-4060-8a5c-dc9f2c903298', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1793.913217] env[67977]: DEBUG oslo.service.loopingcall [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1793.913698] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1793.913931] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-21ce88af-7738-41e1-b72b-34748f207ca5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1793.934374] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1793.934374] env[67977]: value = "task-3468265"
[ 1793.934374] env[67977]: _type = "Task"
[ 1793.934374] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1793.942345] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468265, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1793.950410] env[67977]: DEBUG nova.compute.manager [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Received event network-vif-plugged-a739420a-11a5-4060-8a5c-dc9f2c903298 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1793.950733] env[67977]: DEBUG oslo_concurrency.lockutils [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] Acquiring lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1793.950823] env[67977]: DEBUG oslo_concurrency.lockutils [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] Lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1793.950993] env[67977]: DEBUG oslo_concurrency.lockutils [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] Lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1793.951182] env[67977]: DEBUG nova.compute.manager [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] No waiting events found dispatching network-vif-plugged-a739420a-11a5-4060-8a5c-dc9f2c903298 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1793.951350] env[67977]: WARNING nova.compute.manager [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Received unexpected event network-vif-plugged-a739420a-11a5-4060-8a5c-dc9f2c903298 for instance with vm_state building and task_state spawning.
[ 1793.951515] env[67977]: DEBUG nova.compute.manager [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Received event network-changed-a739420a-11a5-4060-8a5c-dc9f2c903298 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1793.951671] env[67977]: DEBUG nova.compute.manager [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Refreshing instance network info cache due to event network-changed-a739420a-11a5-4060-8a5c-dc9f2c903298. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1793.951855] env[67977]: DEBUG oslo_concurrency.lockutils [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] Acquiring lock "refresh_cache-2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1793.951993] env[67977]: DEBUG oslo_concurrency.lockutils [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] Acquired lock "refresh_cache-2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1793.952167] env[67977]: DEBUG nova.network.neutron [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Refreshing network info cache for port a739420a-11a5-4060-8a5c-dc9f2c903298 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1794.202267] env[67977]: DEBUG nova.network.neutron [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Updated VIF entry in instance network info cache for port a739420a-11a5-4060-8a5c-dc9f2c903298. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1794.202638] env[67977]: DEBUG nova.network.neutron [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Updating instance_info_cache with network_info: [{"id": "a739420a-11a5-4060-8a5c-dc9f2c903298", "address": "fa:16:3e:a1:88:95", "network": {"id": "4eece86d-f584-415a-bcdd-5bde739283be", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1877932954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "52e6b27298274fa1a10d95d9a967814b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "975b168a-03e5-449d-95ac-4d51ba027242", "external-id": "nsx-vlan-transportzone-365", "segmentation_id": 365, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa739420a-11", "ovs_interfaceid": "a739420a-11a5-4060-8a5c-dc9f2c903298", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1794.212261] env[67977]: DEBUG oslo_concurrency.lockutils [req-cc1b5f68-4a8b-4b54-9b45-55b224704410 req-ebc1e151-1fa6-448f-a8cd-348e73c913a7 service nova] Releasing lock "refresh_cache-2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1794.445289] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468265, 'name': CreateVM_Task, 'duration_secs': 0.301008} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1794.445463] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1794.446137] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1794.446310] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1794.446663] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1794.446914] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d0c226ea-8c00-41df-9b20-2917b2c65929 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1794.451331] env[67977]: DEBUG oslo_vmware.api [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for the task: (returnval){
[ 1794.451331] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52b1bd26-ecac-35e3-97c4-beace159e652"
[ 1794.451331] env[67977]: _type = "Task"
[ 1794.451331] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1794.458571] env[67977]: DEBUG oslo_vmware.api [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52b1bd26-ecac-35e3-97c4-beace159e652, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1794.962249] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1794.962585] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1794.962716] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1803.337511] env[67977]: DEBUG oslo_concurrency.lockutils [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1819.774690] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1820.770230] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1820.774833] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1821.775055] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1821.775433] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1821.775433] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1822.775798] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1822.776152] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1823.776181] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1823.788221] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1823.788852] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1823.789145] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1823.789338] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1823.790495] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d84ac66-23fa-4fc4-a1a8-3a43bf3a8b59 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1823.799593] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1cb5d6b-3376-41f9-b4f1-d013d468db1b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1823.813523] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e336ad25-df9f-442e-9152-dbc8b8009b1f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1823.819693] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-674964c2-f834-4495-b60f-4a421f9d5963 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1823.849593] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180927MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1823.849776] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1823.849934] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1823.920884] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance e77a441b-952b-42c0-907f-e30888e505a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1823.921058] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1823.921192] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6fae5126-6618-4337-9a52-d6019727e0b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1823.921318] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b56ab7a8-cd27-4542-8082-ec023c57e153 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1823.921441] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1823.921559] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1823.921678] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d1fc2ae5-fa11-41a7-808b-13da16667078 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1823.921795] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 511896d4-d9cb-42e0-b213-31be3cac191c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1823.921912] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1823.922038] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1823.934465] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1823.946299] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1823.946609] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1823.946827] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1824.076568] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c87414c3-37df-4006-9950-d6806848f2ef {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.084433] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f232a340-91b6-4d25-b0de-0171647ef9ff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.113662] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f43a918-ce14-47e3-9037-0d993e2ebe1e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.120749] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2956107e-5e86-48ce-90d3-f6f6d2afad7e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.134024] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1824.144393] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1824.157763] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1824.157863] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.308s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1827.157790] env[67977]: DEBUG oslo_service.periodic_task [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1827.158096] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1827.158132] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1827.181960] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1827.182160] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1827.182267] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1827.182395] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1827.182521] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1827.182644] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1827.182765] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1827.182883] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1827.183009] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1827.183136] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1827.183257] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1838.161833] env[67977]: WARNING oslo_vmware.rw_handles [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1838.161833] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1838.161833] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1838.161833] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1838.161833] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1838.161833] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1838.161833] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1838.161833] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1838.161833] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1838.161833] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1838.161833] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1838.161833] env[67977]: ERROR oslo_vmware.rw_handles [ 1838.162501] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/d2fb53f2-2d34-4ea8-9a46-d068ecb6cbf5/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1838.164257] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1838.164496] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Copying Virtual Disk [datastore1] vmware_temp/d2fb53f2-2d34-4ea8-9a46-d068ecb6cbf5/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/d2fb53f2-2d34-4ea8-9a46-d068ecb6cbf5/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1838.164782] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c10823ad-eaf0-417f-9e11-d6e87a41db9b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1838.172414] env[67977]: DEBUG oslo_vmware.api [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){ [ 1838.172414] env[67977]: value = "task-3468266" [ 1838.172414] env[67977]: _type = "Task" [ 1838.172414] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1838.180532] env[67977]: DEBUG oslo_vmware.api [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': task-3468266, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1838.683564] env[67977]: DEBUG oslo_vmware.exceptions [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1838.683802] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1838.684413] env[67977]: ERROR nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1838.684413] env[67977]: Faults: ['InvalidArgument'] [ 1838.684413] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] Traceback (most recent call last): [ 1838.684413] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1838.684413] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] yield resources [ 1838.684413] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1838.684413] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] self.driver.spawn(context, instance, image_meta, [ 1838.684413] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1838.684413] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1838.684413] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1838.684413] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] self._fetch_image_if_missing(context, vi) [ 1838.684413] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1838.684413] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] image_cache(vi, tmp_image_ds_loc) [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] vm_util.copy_virtual_disk( [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] session._wait_for_task(vmdk_copy_task) [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] return self.wait_for_task(task_ref) [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] return evt.wait() [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] result = hub.switch() [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] return self.greenlet.switch() [ 1838.684777] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1838.685128] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] self.f(*self.args, **self.kw) [ 1838.685128] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1838.685128] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] raise exceptions.translate_fault(task_info.error) [ 1838.685128] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1838.685128] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] Faults: ['InvalidArgument'] [ 1838.685128] 
env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] [ 1838.685128] env[67977]: INFO nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Terminating instance [ 1838.686372] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1838.686594] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1838.686833] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2200feb6-85f6-4533-bb3b-b9714c8f8172 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1838.689264] env[67977]: DEBUG nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1838.689484] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1838.690203] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c25598e-fba9-4797-b1bd-11ac4a5759d5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1838.696975] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1838.697206] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-49615b9e-72ff-47b6-9400-4d414cf25607 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1838.699344] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1838.699530] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Folder [datastore1] 
devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1838.700489] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c378593b-f8ce-4b94-a624-ae9ce25722e4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1838.705380] env[67977]: DEBUG oslo_vmware.api [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Waiting for the task: (returnval){ [ 1838.705380] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52df37fe-5b8a-77f5-898b-f7cde23b54bd" [ 1838.705380] env[67977]: _type = "Task" [ 1838.705380] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1838.712205] env[67977]: DEBUG oslo_vmware.api [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52df37fe-5b8a-77f5-898b-f7cde23b54bd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1838.772467] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1838.772697] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1838.772864] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Deleting the datastore file [datastore1] e77a441b-952b-42c0-907f-e30888e505a8 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1838.773143] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-af7210e9-5511-4f7e-905d-8a7b6a7c36d4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1838.780172] env[67977]: DEBUG oslo_vmware.api [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){ [ 1838.780172] env[67977]: value = "task-3468268" [ 1838.780172] env[67977]: _type = "Task" [ 1838.780172] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1838.788144] env[67977]: DEBUG oslo_vmware.api [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': task-3468268, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1839.216142] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1839.216420] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Creating directory with path [datastore1] vmware_temp/f8f8d356-ee7d-42ad-9328-e037d93c92dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1839.216616] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f24a0eda-00f7-4fd4-841d-03702b8257c9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1839.227618] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Created directory with path [datastore1] vmware_temp/f8f8d356-ee7d-42ad-9328-e037d93c92dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1839.227810] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Fetch image to [datastore1] vmware_temp/f8f8d356-ee7d-42ad-9328-e037d93c92dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1839.227983] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/f8f8d356-ee7d-42ad-9328-e037d93c92dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1839.228685] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbb977ab-2aa9-43e2-a96b-6442811a36bc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1839.235094] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bcc32d9-c0a5-4b9f-bc5a-ece64341f31c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1839.244039] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a42f28c5-729c-444c-89b7-3b625666636f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1839.274875] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-d0594d2c-c59c-44b3-8e12-a95616438d0e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1839.280557] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3c14d3aa-97a1-4318-8ad9-15485600e030 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1839.289417] env[67977]: DEBUG oslo_vmware.api [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': task-3468268, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065305} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1839.289704] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1839.289891] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1839.290075] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1839.290253] env[67977]: INFO nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1839.292327] env[67977]: DEBUG nova.compute.claims [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1839.292499] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1839.292713] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1839.306028] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1839.360169] env[67977]: DEBUG oslo_vmware.rw_handles [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f8f8d356-ee7d-42ad-9328-e037d93c92dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1839.419683] env[67977]: DEBUG oslo_vmware.rw_handles [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1839.419925] env[67977]: DEBUG oslo_vmware.rw_handles [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f8f8d356-ee7d-42ad-9328-e037d93c92dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1839.527053] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-008b688c-743b-47a4-9828-3272ddf2bdb7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1839.535016] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df4c674f-cc99-4d82-8e1e-943f8bc38c8b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1839.563606] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6ee83bc-25ef-47cf-b032-5b438dd070a4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1839.570105] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce377386-bdb1-4a3c-bf2d-5695e94e331e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1839.583764] env[67977]: DEBUG nova.compute.provider_tree [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1839.592158] env[67977]: DEBUG nova.scheduler.client.report [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1839.608953] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.316s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1839.609531] env[67977]: ERROR nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1839.609531] env[67977]: Faults: ['InvalidArgument'] [ 1839.609531] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] Traceback (most recent call last): [ 1839.609531] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1839.609531] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] 
self.driver.spawn(context, instance, image_meta, [ 1839.609531] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1839.609531] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1839.609531] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1839.609531] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] self._fetch_image_if_missing(context, vi) [ 1839.609531] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1839.609531] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] image_cache(vi, tmp_image_ds_loc) [ 1839.609531] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] vm_util.copy_virtual_disk( [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] session._wait_for_task(vmdk_copy_task) [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] return self.wait_for_task(task_ref) [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] return evt.wait() [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] result = hub.switch() [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] return self.greenlet.switch() [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1839.609920] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] self.f(*self.args, **self.kw) [ 1839.610261] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 1839.610261] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] raise exceptions.translate_fault(task_info.error) [ 1839.610261] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1839.610261] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] Faults: ['InvalidArgument'] [ 1839.610261] env[67977]: ERROR nova.compute.manager [instance: e77a441b-952b-42c0-907f-e30888e505a8] [ 1839.610261] env[67977]: DEBUG nova.compute.utils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1839.611628] env[67977]: DEBUG nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Build of instance e77a441b-952b-42c0-907f-e30888e505a8 was re-scheduled: A specified parameter was not correct: fileType [ 1839.611628] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1839.612035] env[67977]: DEBUG nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1839.612218] env[67977]: DEBUG nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1839.612398] env[67977]: DEBUG nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1839.612559] env[67977]: DEBUG nova.network.neutron [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1839.884451] env[67977]: DEBUG nova.network.neutron [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1839.894610] env[67977]: INFO nova.compute.manager [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Took 0.28 seconds to deallocate network for instance. 
[ 1839.986516] env[67977]: INFO nova.scheduler.client.report [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Deleted allocations for instance e77a441b-952b-42c0-907f-e30888e505a8 [ 1840.007673] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ec2b564b-058e-488a-92cd-4e0a68625915 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "e77a441b-952b-42c0-907f-e30888e505a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 624.712s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1840.008885] env[67977]: DEBUG oslo_concurrency.lockutils [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "e77a441b-952b-42c0-907f-e30888e505a8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 428.651s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1840.009131] env[67977]: DEBUG oslo_concurrency.lockutils [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "e77a441b-952b-42c0-907f-e30888e505a8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1840.009337] env[67977]: DEBUG oslo_concurrency.lockutils [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "e77a441b-952b-42c0-907f-e30888e505a8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1840.009555] env[67977]: DEBUG oslo_concurrency.lockutils [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "e77a441b-952b-42c0-907f-e30888e505a8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1840.011686] env[67977]: INFO nova.compute.manager [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Terminating instance [ 1840.013806] env[67977]: DEBUG nova.compute.manager [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1840.014013] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1840.014505] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-59d3a9f8-a366-4748-b0ee-452185bc8f71 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1840.024032] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7113fad8-f658-433c-8de0-81ab3e62464d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1840.035446] env[67977]: DEBUG nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1840.057257] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e77a441b-952b-42c0-907f-e30888e505a8 could not be found. [ 1840.057469] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1840.057648] env[67977]: INFO nova.compute.manager [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1840.057900] env[67977]: DEBUG oslo.service.loopingcall [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1840.058163] env[67977]: DEBUG nova.compute.manager [-] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1840.058275] env[67977]: DEBUG nova.network.neutron [-] [instance: e77a441b-952b-42c0-907f-e30888e505a8] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1840.092807] env[67977]: DEBUG nova.network.neutron [-] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1840.100115] env[67977]: INFO nova.compute.manager [-] [instance: e77a441b-952b-42c0-907f-e30888e505a8] Took 0.04 seconds to deallocate network for instance. [ 1840.107907] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1840.108174] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1840.109625] env[67977]: INFO nova.compute.claims [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1840.182494] env[67977]: DEBUG oslo_concurrency.lockutils [None req-db4d09ef-3c18-44ed-934f-96337da4d8a7 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "e77a441b-952b-42c0-907f-e30888e505a8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.174s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1840.291295] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-580b0a97-6dd7-4691-a850-6bfb1d397f70 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1840.298736] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b0c1f89-d24a-4de4-a157-bd0182256c22 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1840.327319] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8864ec49-b5a0-4867-93b4-82f8685d7f48 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1840.334145] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-d4463be5-5ab8-423c-a311-374e012c93f1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1840.346769] env[67977]: DEBUG nova.compute.provider_tree [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1840.355816] env[67977]: DEBUG nova.scheduler.client.report [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1840.373363] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1840.373835] env[67977]: DEBUG nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1840.406817] env[67977]: DEBUG nova.compute.utils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1840.408295] env[67977]: DEBUG nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1840.408930] env[67977]: DEBUG nova.network.neutron [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1840.416345] env[67977]: DEBUG nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Start building block device mappings for instance. 
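
"Inventory has not changed for provider ..." above means the report client rebuilt the provider's inventory from host stats, compared it with the copy cached in the ProviderTree, and skipped the placement API call because the two were equal. A sketch of that idempotence check, with the dict shape copied from the logged inventory and the function names illustrative, not Nova's:

    def build_inventory(total_vcpus, total_mem_mb, total_disk_gb):
        """Assemble an inventory dict shaped like the one logged above."""
        return {
            'VCPU': {'total': total_vcpus, 'reserved': 0, 'min_unit': 1,
                     'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0},
            'MEMORY_MB': {'total': total_mem_mb, 'reserved': 512, 'min_unit': 1,
                          'max_unit': 65530, 'step_size': 1,
                          'allocation_ratio': 1.0},
            'DISK_GB': {'total': total_disk_gb, 'reserved': 0, 'min_unit': 1,
                        'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0},
        }

    def maybe_set_inventory(cached, total_vcpus, total_mem_mb, total_disk_gb):
        new = build_inventory(total_vcpus, total_mem_mb, total_disk_gb)
        if new == cached:
            print("Inventory has not changed; skipping placement update")
            return cached
        # otherwise a PUT to /resource_providers/{uuid}/inventories goes here
        return new
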
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1840.480633] env[67977]: DEBUG nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1840.503753] env[67977]: DEBUG nova.virt.hardware [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1840.504012] env[67977]: DEBUG nova.virt.hardware [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1840.504195] env[67977]: DEBUG nova.virt.hardware [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1840.504387] env[67977]: DEBUG nova.virt.hardware [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1840.504536] env[67977]: DEBUG nova.virt.hardware [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1840.504685] env[67977]: DEBUG nova.virt.hardware [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1840.504895] env[67977]: DEBUG nova.virt.hardware [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 1840.505179] env[67977]: DEBUG nova.virt.hardware [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1840.505385] env[67977]: DEBUG nova.virt.hardware [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1840.505554] env[67977]: DEBUG nova.virt.hardware [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1840.505732] env[67977]: DEBUG nova.virt.hardware [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1840.506606] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad0c5700-4cec-46de-8c9f-15314c538c76 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1840.514687] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edfd66ef-2d5c-4cbc-a545-a1d5cc5c32aa {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1840.685930] env[67977]: DEBUG nova.policy [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '766cde5830814a7396549aa7288a0aed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bf3e35c829af479dbea74ebb00553ca4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1840.986420] env[67977]: DEBUG nova.network.neutron [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Successfully created port: f4794423-2f3c-4eb6-acf0-6e0f806ee37f {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1841.622753] env[67977]: DEBUG nova.network.neutron [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Successfully updated port: f4794423-2f3c-4eb6-acf0-6e0f806ee37f {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1841.636406] env[67977]: DEBUG oslo_concurrency.lockutils 
[None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "refresh_cache-d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1841.636562] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquired lock "refresh_cache-d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1841.636718] env[67977]: DEBUG nova.network.neutron [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1841.689712] env[67977]: DEBUG nova.network.neutron [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1841.921555] env[67977]: DEBUG nova.compute.manager [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Received event network-vif-plugged-f4794423-2f3c-4eb6-acf0-6e0f806ee37f {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1841.921778] env[67977]: DEBUG oslo_concurrency.lockutils [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] Acquiring lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1841.921983] env[67977]: DEBUG oslo_concurrency.lockutils [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] Lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1841.922173] env[67977]: DEBUG oslo_concurrency.lockutils [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] Lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1841.922344] env[67977]: DEBUG nova.compute.manager [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] No waiting events found dispatching network-vif-plugged-f4794423-2f3c-4eb6-acf0-6e0f806ee37f {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1841.922507] env[67977]: WARNING nova.compute.manager [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 
service nova] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Received unexpected event network-vif-plugged-f4794423-2f3c-4eb6-acf0-6e0f806ee37f for instance with vm_state building and task_state spawning. [ 1841.922682] env[67977]: DEBUG nova.compute.manager [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Received event network-changed-f4794423-2f3c-4eb6-acf0-6e0f806ee37f {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1841.922833] env[67977]: DEBUG nova.compute.manager [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Refreshing instance network info cache due to event network-changed-f4794423-2f3c-4eb6-acf0-6e0f806ee37f. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1841.922998] env[67977]: DEBUG oslo_concurrency.lockutils [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] Acquiring lock "refresh_cache-d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1841.944618] env[67977]: DEBUG nova.network.neutron [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Updating instance_info_cache with network_info: [{"id": "f4794423-2f3c-4eb6-acf0-6e0f806ee37f", "address": "fa:16:3e:34:d4:f7", "network": {"id": "1164109a-0805-4699-b73b-2f458affef73", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-646858955-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf3e35c829af479dbea74ebb00553ca4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f8442aa5-73db-4599-8564-b98a6ea26b9b", "external-id": "nsx-vlan-transportzone-893", "segmentation_id": 893, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf4794423-2f", "ovs_interfaceid": "f4794423-2f3c-4eb6-acf0-6e0f806ee37f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1841.957033] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Releasing lock "refresh_cache-d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1841.957342] env[67977]: DEBUG nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Instance network_info: |[{"id": 
"f4794423-2f3c-4eb6-acf0-6e0f806ee37f", "address": "fa:16:3e:34:d4:f7", "network": {"id": "1164109a-0805-4699-b73b-2f458affef73", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-646858955-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf3e35c829af479dbea74ebb00553ca4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f8442aa5-73db-4599-8564-b98a6ea26b9b", "external-id": "nsx-vlan-transportzone-893", "segmentation_id": 893, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf4794423-2f", "ovs_interfaceid": "f4794423-2f3c-4eb6-acf0-6e0f806ee37f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1841.957709] env[67977]: DEBUG oslo_concurrency.lockutils [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] Acquired lock "refresh_cache-d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1841.957899] env[67977]: DEBUG nova.network.neutron [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Refreshing network info cache for port f4794423-2f3c-4eb6-acf0-6e0f806ee37f {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1841.958908] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:d4:f7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f8442aa5-73db-4599-8564-b98a6ea26b9b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f4794423-2f3c-4eb6-acf0-6e0f806ee37f', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1841.966993] env[67977]: DEBUG oslo.service.loopingcall [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1841.967870] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1841.970213] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-44b22799-9d4d-440e-b91f-8fec01859566 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1841.990852] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1841.990852] env[67977]: value = "task-3468269" [ 1841.990852] env[67977]: _type = "Task" [ 1841.990852] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1841.999040] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468269, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1842.248595] env[67977]: DEBUG nova.network.neutron [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Updated VIF entry in instance network info cache for port f4794423-2f3c-4eb6-acf0-6e0f806ee37f. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1842.249048] env[67977]: DEBUG nova.network.neutron [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Updating instance_info_cache with network_info: [{"id": "f4794423-2f3c-4eb6-acf0-6e0f806ee37f", "address": "fa:16:3e:34:d4:f7", "network": {"id": "1164109a-0805-4699-b73b-2f458affef73", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-646858955-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf3e35c829af479dbea74ebb00553ca4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f8442aa5-73db-4599-8564-b98a6ea26b9b", "external-id": "nsx-vlan-transportzone-893", "segmentation_id": 893, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf4794423-2f", "ovs_interfaceid": "f4794423-2f3c-4eb6-acf0-6e0f806ee37f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1842.259603] env[67977]: DEBUG oslo_concurrency.lockutils [req-a2435bfa-beb5-4829-afaf-9b86a7ba03ed req-29f983c0-2a9a-4a30-9a82-d0eec6797fe0 service nova] Releasing lock "refresh_cache-d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1842.501400] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468269, 'name': CreateVM_Task, 'duration_secs': 0.294455} completed successfully. 
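
The task handling above follows oslo.vmware's submit-then-poll pattern: Folder.CreateVM_Task returns a task handle ("task-3468269"), and the client polls it, logging progress, until the task reports success (0.294s here) or error. A condensed, stdlib-only version of the polling loop; the state names, interval, and timeout are assumptions, the real loop lives in oslo_vmware.api:

    import time

    class TaskFailed(Exception):
        pass

    def wait_for_task(get_task_info, interval=0.5, timeout=300):
        """Poll a task-info callable until the task succeeds or errors."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()  # dict like {'state': ..., 'progress': ...}
            if info["state"] == "success":
                return info
            if info["state"] == "error":
                raise TaskFailed(info.get("error"))
            print(f"progress is {info.get('progress', 0)}%")
            time.sleep(interval)
        raise TimeoutError("task did not complete in time")
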
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1842.501561] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1842.502230] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1842.502393] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1842.502705] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1842.502951] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fe6a9c51-7a19-4043-9bd4-c6f9c739357d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1842.507244] env[67977]: DEBUG oslo_vmware.api [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for the task: (returnval){ [ 1842.507244] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52bacbf8-1eb0-437c-86ae-3ae3cdc621da" [ 1842.507244] env[67977]: _type = "Task" [ 1842.507244] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1842.514916] env[67977]: DEBUG oslo_vmware.api [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52bacbf8-1eb0-437c-86ae-3ae3cdc621da, 'name': SearchDatastore_Task} progress is 0%. 
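
The lock and semaphore taken on [datastore1] devstack-image-cache_base/<image id> before HostDatastoreBrowser.SearchDatastore_Task are the image-cache discipline: serialize per image, look for the cached VMDK, and only download from Glance when it is missing. A local-filesystem stand-in for that check; the real probe is a datastore search, not os.path, and the names here are illustrative:

    import os

    def ensure_cached_image(cache_root, image_id, download):
        """Return the cached VMDK path, downloading it first if absent."""
        vmdk = os.path.join(cache_root, image_id, f"{image_id}.vmdk")
        if os.path.exists(vmdk):  # stands in for SearchDatastore_Task
            return vmdk
        os.makedirs(os.path.dirname(vmdk), exist_ok=True)
        download(vmdk)            # fetch from Glance, write to the cache path
        return vmdk
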
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1843.017331] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1843.017721] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1843.017832] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1881.776207] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1881.802800] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1881.803158] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1881.803380] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1882.775871] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1882.776182] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1882.776339] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... 
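
The periodic_task lines fire from one service loop, and _reclaim_queued_deletes returns immediately because CONF.reclaim_instance_interval <= 0. A toy version of that dispatch-and-skip pattern, with the cadence handling greatly simplified (real tasks each carry their own spacing):

    import time

    RECLAIM_INSTANCE_INTERVAL = 0  # mirrors CONF.reclaim_instance_interval

    def reclaim_queued_deletes():
        if RECLAIM_INSTANCE_INTERVAL <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return
        # soft-deleted instances older than the interval would be reclaimed here

    def run_periodic(tasks, ticks=3, spacing=1.0):
        for _ in range(ticks):
            for task in tasks:
                print(f"Running periodic task {task.__name__}")
                task()
            time.sleep(spacing)

    run_periodic([reclaim_queued_deletes])
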
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1883.775807] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1883.776082] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1885.775644] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1885.789679] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1885.789892] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1885.790074] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1885.790273] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1885.791393] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ff5eda9-f9f4-4698-80d4-db6eea3cf8f2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1885.800305] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fbeb54b-ab0d-47c7-b08e-c9cdbd3524db {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1885.814161] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fe518ba-bf37-4307-8d4f-3999556bb239 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1885.820340] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ade0b59e-c7cf-4172-b609-04d68664c64a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1885.849320] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] 
Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180923MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1885.849482] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1885.849675] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1885.996584] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1885.996754] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 6fae5126-6618-4337-9a52-d6019727e0b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1885.996887] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b56ab7a8-cd27-4542-8082-ec023c57e153 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1885.997025] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1885.997159] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1885.997280] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d1fc2ae5-fa11-41a7-808b-13da16667078 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1885.997398] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 511896d4-d9cb-42e0-b213-31be3cac191c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1885.997513] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1885.997627] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1885.997740] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1886.009925] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
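
The audit above visits every placement allocation against this node and sorts each into a case: instances actively managed here keep their allocations, while an instance scheduled to this host but "yet to start" is skipped rather than healed. A deliberately simplified classifier for those cases; the state fields are assumptions, not Nova's actual object model:

    def audit_allocation(instance, this_node):
        """Classify one placement allocation during the resource-tracker audit."""
        if instance["node"] == this_node and instance["task_state"] != "scheduling":
            return "keep"    # actively managed here; allocation is correct
        if instance["task_state"] == "scheduling":
            return "skip"    # scheduled here but not yet started: do not heal
        return "remove"      # deleted or moved elsewhere: free the allocation
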
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1886.010162] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1886.010308] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1886.027199] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing inventories for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1886.040443] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating ProviderTree inventory for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1886.040725] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating inventory in ProviderTree for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1886.050945] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing aggregate associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, aggregates: None {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1886.067291] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing trait associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, traits: COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1886.187058] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5257c8cc-6f94-4e49-8e75-51156fbaae77 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1886.195214] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-670e9838-d7ee-4026-ac58-d165dae3f2bb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1886.226309] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6b411ed-9a1b-45cf-975d-6a43ff809149 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1886.233656] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22584f66-aa6f-47fb-b3bf-7642034335da {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1886.246399] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1886.254565] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1886.268470] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1886.268667] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.419s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1888.241335] env[67977]: WARNING oslo_vmware.rw_handles [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1888.241335] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1888.241335] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1888.241335] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1888.241335] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1888.241335] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1888.241335] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1888.241335] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1888.241335] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1888.241335] 
env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1888.241335] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1888.241335] env[67977]: ERROR oslo_vmware.rw_handles [ 1888.241953] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/f8f8d356-ee7d-42ad-9328-e037d93c92dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1888.243876] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1888.244174] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Copying Virtual Disk [datastore1] vmware_temp/f8f8d356-ee7d-42ad-9328-e037d93c92dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/f8f8d356-ee7d-42ad-9328-e037d93c92dc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1888.244558] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7ed0905c-bbd3-46be-b905-c23bd151ffd9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1888.253385] env[67977]: DEBUG oslo_vmware.api [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Waiting for the task: (returnval){ [ 1888.253385] env[67977]: value = "task-3468270" [ 1888.253385] env[67977]: _type = "Task" [ 1888.253385] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1888.261546] env[67977]: DEBUG oslo_vmware.api [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Task: {'id': task-3468270, 'name': CopyVirtualDisk_Task} progress is 0%. 
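
The rw_handles warning above comes from closing the HTTP connection that streamed the image onto the datastore: the host may drop the connection without sending a final response, which Python surfaces as http.client.RemoteDisconnected, and the handle logs it rather than failing the transfer (the very next line confirms the file landed). A stdlib-only sketch of that tolerant close; the real handler is oslo_vmware.rw_handles:

    import http.client

    def close_write_handle(conn):
        """Read the final response if the server sent one; tolerate a drop."""
        try:
            resp = conn.getresponse()
            resp.read()
        except http.client.RemoteDisconnected as exc:
            print(f"Error occurred while reading the HTTP response.: {exc!r}")
        finally:
            conn.close()
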
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1888.268880] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1888.269130] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1888.269272] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1888.291898] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1888.292098] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1888.292214] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1888.292347] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1888.292471] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1888.292594] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1888.292715] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1888.292852] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Skipping network cache update for instance because it is Building. 
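
_heal_instance_info_cache refreshes at most one instance's Neutron cache per pass, and every instance above is passed over because its vm_state is still building. The selection logic, sketched with assumed field names:

    def pick_instance_to_heal(instances):
        """Return the first instance whose network cache may be refreshed."""
        for inst in instances:
            if inst["vm_state"] == "building":
                print(f"[instance: {inst['uuid']}] Skipping network cache "
                      "update for instance because it is Building.")
                continue
            return inst
        print("Didn't find any instances for network info cache update.")
        return None
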
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1888.293014] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1888.293124] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1888.293226] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1888.763565] env[67977]: DEBUG oslo_vmware.exceptions [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1888.763837] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1888.764409] env[67977]: ERROR nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1888.764409] env[67977]: Faults: ['InvalidArgument'] [ 1888.764409] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Traceback (most recent call last): [ 1888.764409] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1888.764409] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] yield resources [ 1888.764409] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1888.764409] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] self.driver.spawn(context, instance, image_meta, [ 1888.764409] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1888.764409] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1888.764409] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1888.764409] env[67977]: ERROR nova.compute.manager 
[instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] self._fetch_image_if_missing(context, vi) [ 1888.764409] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] image_cache(vi, tmp_image_ds_loc) [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] vm_util.copy_virtual_disk( [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] session._wait_for_task(vmdk_copy_task) [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] return self.wait_for_task(task_ref) [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] return evt.wait() [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] result = hub.switch() [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1888.764824] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] return self.greenlet.switch() [ 1888.765245] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1888.765245] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] self.f(*self.args, **self.kw) [ 1888.765245] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1888.765245] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] raise exceptions.translate_fault(task_info.error) [ 1888.765245] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1888.765245] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Faults: ['InvalidArgument'] [ 1888.765245] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] [ 1888.765245] env[67977]: INFO 
nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Terminating instance [ 1888.766314] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1888.766518] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1888.766756] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-da517df1-c862-4a60-9284-86c1a4118039 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1888.769101] env[67977]: DEBUG nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1888.769304] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1888.770024] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af8bfe19-9fc8-4906-869e-73f5f17a7307 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1888.776722] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1888.776868] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Cleaning up deleted instances with incomplete migration {{(pid=67977) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1888.777768] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1888.778887] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-87e21e79-be92-457f-8bfa-646a3e1a4e94 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1888.780233] env[67977]: DEBUG nova.virt.vmwareapi.ds_util 
[None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1888.780407] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1888.781254] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-563c6bce-fe9d-4182-a72e-6544806530f3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1888.786111] env[67977]: DEBUG oslo_vmware.api [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for the task: (returnval){ [ 1888.786111] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]524056bc-9918-1f5c-8d0c-63bd5b587c8d" [ 1888.786111] env[67977]: _type = "Task" [ 1888.786111] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1888.790091] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1888.796357] env[67977]: DEBUG oslo_vmware.api [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]524056bc-9918-1f5c-8d0c-63bd5b587c8d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1888.857990] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1888.858225] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1888.858409] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Deleting the datastore file [datastore1] 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1888.858712] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7ec95547-e6c7-4a5e-a8d9-4a9b8a364650 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1888.865132] env[67977]: DEBUG oslo_vmware.api [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Waiting for the task: (returnval){ [ 1888.865132] env[67977]: value = "task-3468272" [ 1888.865132] env[67977]: _type = "Task" [ 1888.865132] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1888.874024] env[67977]: DEBUG oslo_vmware.api [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Task: {'id': task-3468272, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1889.296995] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1889.297322] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Creating directory with path [datastore1] vmware_temp/e15ff709-b7fc-4b38-99a7-b6e1c4c5f28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1889.297518] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9543d85e-d326-4c3b-8069-391c647ebd38 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.308879] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Created directory with path [datastore1] vmware_temp/e15ff709-b7fc-4b38-99a7-b6e1c4c5f28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1889.309096] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Fetch image to [datastore1] vmware_temp/e15ff709-b7fc-4b38-99a7-b6e1c4c5f28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1889.309674] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/e15ff709-b7fc-4b38-99a7-b6e1c4c5f28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1889.310033] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fed18b2-3334-431c-9eab-a794f0c72755 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.317261] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07d221f8-4083-4343-be3d-94d9da25a4da {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.326306] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2af87a40-9acf-4d8f-8a4d-cf1e29e1659a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.356451] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1dc93b42-0afb-4f41-b12b-fec551c0e0b5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.362163] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1fa2a5d3-74f8-4567-86d2-23f97db9da5d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.373162] env[67977]: DEBUG oslo_vmware.api [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Task: {'id': task-3468272, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076106} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1889.373403] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1889.373585] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1889.373759] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1889.373935] env[67977]: INFO nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Took 0.60 seconds to destroy the instance on the hypervisor. 
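The DeleteDatastoreFile_Task exchange above is the standard oslo.vmware pattern: submit a vCenter task, then poll its TaskInfo until it reports success or a fault. The earlier CopyVirtualDisk_Task died at exactly that point, when _poll_task translated the vCenter error into VimFaultException ("A specified parameter was not correct: fileType"). Below is a minimal sketch of such a poll loop, for orientation only: it is not oslo.vmware's implementation, and fetch_task_info is a hypothetical stand-in for a TaskInfo lookup.

import time

class TaskFailed(Exception):
    """Raised when the polled task ends in an error state (cf. translate_fault)."""

def wait_for_task(fetch_task_info, task_id, interval=0.5, timeout=300.0):
    # Poll until the task reaches a terminal state, as the log's
    # "progress is 0%" ... "completed successfully" lines reflect.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = fetch_task_info(task_id)   # e.g. {'state': 'running', 'progress': 0}
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            # oslo.vmware raises a translated fault here instead
            raise TaskFailed(info.get('error', 'unknown fault'))
        time.sleep(interval)              # the real code reschedules a looping call
    raise TimeoutError(f'task {task_id} did not complete within {timeout}s')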
[ 1889.376163] env[67977]: DEBUG nova.compute.claims [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1889.376339] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1889.376553] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1889.386331] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1889.513722] env[67977]: DEBUG oslo_vmware.rw_handles [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e15ff709-b7fc-4b38-99a7-b6e1c4c5f28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1889.575609] env[67977]: DEBUG oslo_vmware.rw_handles [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1889.575820] env[67977]: DEBUG oslo_vmware.rw_handles [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e15ff709-b7fc-4b38-99a7-b6e1c4c5f28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1889.614624] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e1ca1e-9a4c-449f-9aa6-64cb565a7230 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.621939] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-202dde1d-178e-4cc0-983b-66d80b9b578f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.652154] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66ad8467-c0a7-467f-8ff3-3bf2c12ca420 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.659031] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-900fb3a2-2cc0-4f07-a8d3-003f32eaabcf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.671592] env[67977]: DEBUG nova.compute.provider_tree [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1889.679743] env[67977]: DEBUG nova.scheduler.client.report [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1889.694331] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.318s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1889.694839] env[67977]: ERROR nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1889.694839] env[67977]: Faults: ['InvalidArgument'] [ 1889.694839] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Traceback (most recent call last): [ 1889.694839] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1889.694839] 
env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] self.driver.spawn(context, instance, image_meta, [ 1889.694839] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1889.694839] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1889.694839] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1889.694839] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] self._fetch_image_if_missing(context, vi) [ 1889.694839] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1889.694839] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] image_cache(vi, tmp_image_ds_loc) [ 1889.694839] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] vm_util.copy_virtual_disk( [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] session._wait_for_task(vmdk_copy_task) [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] return self.wait_for_task(task_ref) [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] return evt.wait() [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] result = hub.switch() [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] return self.greenlet.switch() [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1889.695238] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] self.f(*self.args, **self.kw) [ 1889.695609] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1889.695609] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] raise exceptions.translate_fault(task_info.error) [ 1889.695609] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1889.695609] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Faults: ['InvalidArgument'] [ 1889.695609] env[67977]: ERROR nova.compute.manager [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] [ 1889.695609] env[67977]: DEBUG nova.compute.utils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1889.696808] env[67977]: DEBUG nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Build of instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 was re-scheduled: A specified parameter was not correct: fileType [ 1889.696808] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1889.697197] env[67977]: DEBUG nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1889.697369] env[67977]: DEBUG nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1889.697536] env[67977]: DEBUG nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1889.697699] env[67977]: DEBUG nova.network.neutron [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1890.093520] env[67977]: DEBUG nova.network.neutron [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1890.106457] env[67977]: INFO nova.compute.manager [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Took 0.41 seconds to deallocate network for instance. [ 1890.211276] env[67977]: INFO nova.scheduler.client.report [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Deleted allocations for instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 [ 1890.233887] env[67977]: DEBUG oslo_concurrency.lockutils [None req-737421b3-19b2-4f44-bf29-8a79bb6df555 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 619.540s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1890.235022] env[67977]: DEBUG oslo_concurrency.lockutils [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 423.024s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1890.235534] env[67977]: DEBUG oslo_concurrency.lockutils [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Acquiring lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1890.235534] env[67977]: DEBUG oslo_concurrency.lockutils [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1890.235685] env[67977]: DEBUG oslo_concurrency.lockutils [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1890.237538] env[67977]: INFO nova.compute.manager [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Terminating instance [ 1890.239317] env[67977]: DEBUG nova.compute.manager [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1890.239408] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1890.240024] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-96720bb9-4517-49a9-9b17-0cde826da788 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.245498] env[67977]: DEBUG nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1890.253023] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57b01308-e8be-4124-b3ad-1c7f614083f2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.280848] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2 could not be found. 
[ 1890.281062] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1890.281244] env[67977]: INFO nova.compute.manager [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1890.281479] env[67977]: DEBUG oslo.service.loopingcall [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1890.283613] env[67977]: DEBUG nova.compute.manager [-] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1890.283718] env[67977]: DEBUG nova.network.neutron [-] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1890.297151] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1890.297400] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1890.298973] env[67977]: INFO nova.compute.claims [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1890.310429] env[67977]: DEBUG nova.network.neutron [-] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1890.320522] env[67977]: INFO nova.compute.manager [-] [instance: 1d31ce6c-18e6-45b3-acc7-2339d2ed62a2] Took 0.04 seconds to deallocate network for instance.
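Note how every claim in this stretch funnels through the single "compute_resources" lock, with lockutils reporting waited/held times (instance_claim waited 0.000s here and held 0.260s a few entries later). The sketch below imitates that accounting with plain threading, purely as an illustration; nova itself relies on oslo_concurrency.lockutils with named locks.

import threading
import time

_compute_resources = threading.Lock()   # analogue of Lock "compute_resources"

def instance_claim(tracker, log=print):
    t0 = time.monotonic()
    with _compute_resources:
        log(f'acquired :: waited {time.monotonic() - t0:.3f}s')
        t1 = time.monotonic()
        tracker['claims'] = tracker.get('claims', 0) + 1   # mutate usage under the lock
    log(f'released :: held {time.monotonic() - t1:.3f}s')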
[ 1890.415237] env[67977]: DEBUG oslo_concurrency.lockutils [None req-050c504a-494d-4927-b797-b0a221ea16d8 tempest-ServerActionsTestOtherA-1136155460 tempest-ServerActionsTestOtherA-1136155460-project-member] Lock "1d31ce6c-18e6-45b3-acc7-2339d2ed62a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.180s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1890.478253] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df1b49a2-c203-4627-9628-d8950d77b0fe {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.486293] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c98961ed-73d1-4844-9bc0-af64ad09813d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.515787] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60e8c898-f9f9-4de4-b05f-fc5857a07a38 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.522543] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6736ac6d-1c22-4512-8763-c3db83ff8ce8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.535129] env[67977]: DEBUG nova.compute.provider_tree [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1890.543941] env[67977]: DEBUG nova.scheduler.client.report [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1890.557197] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1890.557643] env[67977]: DEBUG nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Start building networks asynchronously for instance.
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1890.586998] env[67977]: DEBUG nova.compute.utils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1890.589552] env[67977]: DEBUG nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1890.589724] env[67977]: DEBUG nova.network.neutron [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1890.598522] env[67977]: DEBUG nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1890.654409] env[67977]: DEBUG nova.policy [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4965d451810c48458246493019d83172', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d528c04bd83409eb74e20393651c040', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1890.663739] env[67977]: DEBUG nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1890.689116] env[67977]: DEBUG nova.virt.hardware [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1890.689358] env[67977]: DEBUG nova.virt.hardware [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1890.689514] env[67977]: DEBUG nova.virt.hardware [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1890.689693] env[67977]: DEBUG nova.virt.hardware [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1890.689844] env[67977]: DEBUG nova.virt.hardware [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1890.689991] env[67977]: DEBUG nova.virt.hardware [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1890.691356] env[67977]: DEBUG nova.virt.hardware [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1890.691551] env[67977]: DEBUG nova.virt.hardware [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1890.691744] env[67977]: DEBUG nova.virt.hardware [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 
tempest-ServersTestJSON-1986579007-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1890.691932] env[67977]: DEBUG nova.virt.hardware [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1890.692132] env[67977]: DEBUG nova.virt.hardware [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1890.692995] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0f97a0c-0e3a-4fd1-836b-7a23122fcf6d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.701548] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b980cdc-4188-4ee7-8caf-b1cc3f8c9d2d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.795958] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1890.796176] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Cleaning up deleted instances {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1890.805223] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] There are 0 instances to clean {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1890.932198] env[67977]: DEBUG nova.network.neutron [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Successfully created port: 4bb7af47-c67f-4117-beb3-1c889de5dd9e {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1891.615681] env[67977]: DEBUG nova.network.neutron [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Successfully updated port: 4bb7af47-c67f-4117-beb3-1c889de5dd9e {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1891.627074] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "refresh_cache-ad0b21ff-90be-4a78-8cc7-b347df8579a9" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1891.627074] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquired lock "refresh_cache-ad0b21ff-90be-4a78-8cc7-b347df8579a9" {{(pid=67977) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1891.627074] env[67977]: DEBUG nova.network.neutron [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1891.669515] env[67977]: DEBUG nova.network.neutron [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1891.857617] env[67977]: DEBUG nova.network.neutron [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Updating instance_info_cache with network_info: [{"id": "4bb7af47-c67f-4117-beb3-1c889de5dd9e", "address": "fa:16:3e:d3:4e:80", "network": {"id": "da4332ac-6c00-499e-81de-2b64bd556acc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1328497137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d528c04bd83409eb74e20393651c040", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ded18042-834c-4792-b3e8-b1c377446432", "external-id": "nsx-vlan-transportzone-293", "segmentation_id": 293, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4bb7af47-c6", "ovs_interfaceid": "4bb7af47-c67f-4117-beb3-1c889de5dd9e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1891.868881] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Releasing lock "refresh_cache-ad0b21ff-90be-4a78-8cc7-b347df8579a9" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1891.869190] env[67977]: DEBUG nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Instance network_info: |[{"id": "4bb7af47-c67f-4117-beb3-1c889de5dd9e", "address": "fa:16:3e:d3:4e:80", "network": {"id": "da4332ac-6c00-499e-81de-2b64bd556acc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1328497137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": 
false, "tenant_id": "1d528c04bd83409eb74e20393651c040", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ded18042-834c-4792-b3e8-b1c377446432", "external-id": "nsx-vlan-transportzone-293", "segmentation_id": 293, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4bb7af47-c6", "ovs_interfaceid": "4bb7af47-c67f-4117-beb3-1c889de5dd9e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1891.869589] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d3:4e:80', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ded18042-834c-4792-b3e8-b1c377446432', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4bb7af47-c67f-4117-beb3-1c889de5dd9e', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1891.877210] env[67977]: DEBUG oslo.service.loopingcall [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1891.877673] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1891.877899] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ac26b992-38d6-43b4-b7a6-fd29c9a9de1c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1891.898739] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1891.898739] env[67977]: value = "task-3468273" [ 1891.898739] env[67977]: _type = "Task" [ 1891.898739] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1891.906550] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468273, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1892.162193] env[67977]: DEBUG nova.compute.manager [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Received event network-vif-plugged-4bb7af47-c67f-4117-beb3-1c889de5dd9e {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1892.162193] env[67977]: DEBUG oslo_concurrency.lockutils [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] Acquiring lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1892.162314] env[67977]: DEBUG oslo_concurrency.lockutils [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] Lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1892.162480] env[67977]: DEBUG oslo_concurrency.lockutils [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] Lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1892.162594] env[67977]: DEBUG nova.compute.manager [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] No waiting events found dispatching network-vif-plugged-4bb7af47-c67f-4117-beb3-1c889de5dd9e {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1892.162726] env[67977]: WARNING nova.compute.manager [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Received unexpected event network-vif-plugged-4bb7af47-c67f-4117-beb3-1c889de5dd9e for instance with vm_state building and task_state spawning. [ 1892.162886] env[67977]: DEBUG nova.compute.manager [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Received event network-changed-4bb7af47-c67f-4117-beb3-1c889de5dd9e {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1892.163191] env[67977]: DEBUG nova.compute.manager [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Refreshing instance network info cache due to event network-changed-4bb7af47-c67f-4117-beb3-1c889de5dd9e.
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1892.163399] env[67977]: DEBUG oslo_concurrency.lockutils [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] Acquiring lock "refresh_cache-ad0b21ff-90be-4a78-8cc7-b347df8579a9" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1892.163539] env[67977]: DEBUG oslo_concurrency.lockutils [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] Acquired lock "refresh_cache-ad0b21ff-90be-4a78-8cc7-b347df8579a9" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1892.163778] env[67977]: DEBUG nova.network.neutron [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Refreshing network info cache for port 4bb7af47-c67f-4117-beb3-1c889de5dd9e {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1892.408535] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468273, 'name': CreateVM_Task, 'duration_secs': 0.292616} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1892.408814] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1892.409374] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1892.409544] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1892.409867] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1892.410130] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-91ffbbaf-86cc-4f88-83ed-8813407a9688 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.414847] env[67977]: DEBUG oslo_vmware.api [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for the task: (returnval){ [ 1892.414847] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52985d48-b011-6b02-39f4-0cddb16bebf2" [ 1892.414847] env[67977]: _type = "Task" [ 1892.414847] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1892.423815] env[67977]: DEBUG oslo_vmware.api [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52985d48-b011-6b02-39f4-0cddb16bebf2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1892.432241] env[67977]: DEBUG nova.network.neutron [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Updated VIF entry in instance network info cache for port 4bb7af47-c67f-4117-beb3-1c889de5dd9e. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1892.432548] env[67977]: DEBUG nova.network.neutron [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Updating instance_info_cache with network_info: [{"id": "4bb7af47-c67f-4117-beb3-1c889de5dd9e", "address": "fa:16:3e:d3:4e:80", "network": {"id": "da4332ac-6c00-499e-81de-2b64bd556acc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1328497137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d528c04bd83409eb74e20393651c040", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ded18042-834c-4792-b3e8-b1c377446432", "external-id": "nsx-vlan-transportzone-293", "segmentation_id": 293, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4bb7af47-c6", "ovs_interfaceid": "4bb7af47-c67f-4117-beb3-1c889de5dd9e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1892.441709] env[67977]: DEBUG oslo_concurrency.lockutils [req-bd6536c2-c66d-4d34-99f8-f1a8b3b7dc6c req-7771f318-ab13-4a93-ac66-36f67ab3e9e6 service nova] Releasing lock "refresh_cache-ad0b21ff-90be-4a78-8cc7-b347df8579a9" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1892.926032] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1892.926032] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1892.926032] 
env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1904.681669] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1904.704325] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Getting list of instances from cluster (obj){ [ 1904.704325] env[67977]: value = "domain-c8" [ 1904.704325] env[67977]: _type = "ClusterComputeResource" [ 1904.704325] env[67977]: } {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1904.705606] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28a24986-d108-4b93-bfec-e420a6ca1b03 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.723545] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Got total of 10 instances {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1904.723707] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 6fae5126-6618-4337-9a52-d6019727e0b0 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1904.723906] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid b56ab7a8-cd27-4542-8082-ec023c57e153 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1904.724082] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid ac4fe863-2435-48ed-9c7c-9e7144be8e70 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1904.724245] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1904.724398] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid d1fc2ae5-fa11-41a7-808b-13da16667078 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1904.724548] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 511896d4-d9cb-42e0-b213-31be3cac191c {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1904.724699] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 157e3bfe-10cc-49c6-aa31-1d935e1a4465 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1904.724844] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 
{{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1904.724988] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1904.725154] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid ad0b21ff-90be-4a78-8cc7-b347df8579a9 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1904.725485] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "6fae5126-6618-4337-9a52-d6019727e0b0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.725720] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "b56ab7a8-cd27-4542-8082-ec023c57e153" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.725919] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.726133] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.726357] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "d1fc2ae5-fa11-41a7-808b-13da16667078" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.726594] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "511896d4-d9cb-42e0-b213-31be3cac191c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.726757] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.726947] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.727153] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.727347] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1908.585189] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1908.586022] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1935.265053] env[67977]: WARNING oslo_vmware.rw_handles [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1935.265053] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1935.265053] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1935.265053] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1935.265053] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1935.265053] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1935.265053] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1935.265053] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1935.265053] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1935.265053] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1935.265053] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1935.265053] env[67977]: ERROR oslo_vmware.rw_handles [ 1935.265729] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance:
6fae5126-6618-4337-9a52-d6019727e0b0] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/e15ff709-b7fc-4b38-99a7-b6e1c4c5f28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1935.267406] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1935.267660] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Copying Virtual Disk [datastore1] vmware_temp/e15ff709-b7fc-4b38-99a7-b6e1c4c5f28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/e15ff709-b7fc-4b38-99a7-b6e1c4c5f28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1935.267969] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6f711333-0264-44eb-b365-153dbf88ec29 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1935.275594] env[67977]: DEBUG oslo_vmware.api [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for the task: (returnval){ [ 1935.275594] env[67977]: value = "task-3468274" [ 1935.275594] env[67977]: _type = "Task" [ 1935.275594] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1935.283523] env[67977]: DEBUG oslo_vmware.api [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Task: {'id': task-3468274, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1935.786597] env[67977]: DEBUG oslo_vmware.exceptions [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1935.786899] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1935.787473] env[67977]: ERROR nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1935.787473] env[67977]: Faults: ['InvalidArgument'] [ 1935.787473] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Traceback (most recent call last): [ 1935.787473] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1935.787473] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] yield resources [ 1935.787473] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1935.787473] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] self.driver.spawn(context, instance, image_meta, [ 1935.787473] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1935.787473] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1935.787473] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1935.787473] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] self._fetch_image_if_missing(context, vi) [ 1935.787473] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] image_cache(vi, tmp_image_ds_loc) [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] vm_util.copy_virtual_disk( [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] session._wait_for_task(vmdk_copy_task) [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] return self.wait_for_task(task_ref) [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] return evt.wait() [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] result = hub.switch() [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1935.787840] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] return self.greenlet.switch() [ 1935.788224] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1935.788224] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] self.f(*self.args, **self.kw) [ 1935.788224] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1935.788224] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] raise exceptions.translate_fault(task_info.error) [ 1935.788224] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1935.788224] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Faults: ['InvalidArgument'] [ 1935.788224] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] [ 1935.788224] env[67977]: INFO nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Terminating instance [ 1935.789477] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1935.789692] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1935.789935] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-a33b6731-752b-4caa-bf92-cfd4abd4996c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1935.793026] env[67977]: DEBUG nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1935.793026] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1935.793586] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71e8a322-a6cd-4fa1-a454-bc55ec9cc9ed {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1935.798673] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1935.798842] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1935.799866] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-99937f01-ec55-4063-bb40-731b5446c09d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1935.804412] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1935.804940] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6a55a59a-f642-4054-add9-407b28e24a33 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1935.807338] env[67977]: DEBUG oslo_vmware.api [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for the task: (returnval){ [ 1935.807338] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5258c35d-106c-3ef7-0fc1-cf5d61d584e3" [ 1935.807338] env[67977]: _type = "Task" [ 1935.807338] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1935.815348] env[67977]: DEBUG oslo_vmware.api [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5258c35d-106c-3ef7-0fc1-cf5d61d584e3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1935.880217] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1935.880472] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1935.880596] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Deleting the datastore file [datastore1] 6fae5126-6618-4337-9a52-d6019727e0b0 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1935.880857] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a18042a5-5960-4296-ade5-118bce6f63ad {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1935.887385] env[67977]: DEBUG oslo_vmware.api [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for the task: (returnval){ [ 1935.887385] env[67977]: value = "task-3468276" [ 1935.887385] env[67977]: _type = "Task" [ 1935.887385] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1935.894897] env[67977]: DEBUG oslo_vmware.api [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Task: {'id': task-3468276, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1936.317888] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1936.318257] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Creating directory with path [datastore1] vmware_temp/0cef9645-5b58-40de-85ca-be0290b640bc/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1936.318397] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-64257b61-abde-4bd1-81f0-460eab7b000a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1936.329350] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Created directory with path [datastore1] vmware_temp/0cef9645-5b58-40de-85ca-be0290b640bc/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1936.329537] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Fetch image to [datastore1] vmware_temp/0cef9645-5b58-40de-85ca-be0290b640bc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1936.329707] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/0cef9645-5b58-40de-85ca-be0290b640bc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1936.330451] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acb31a5e-aaa3-4ed7-835a-1bca8b9e6939 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1936.336858] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d6a328e-96dc-4fa8-a566-923ffb907029 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1936.345721] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5be0467c-3752-45d1-a0c2-3043182a4dd3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1936.375177] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e7bf4ae9-f135-4a76-a010-b2ca69097618 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1936.380417] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-85da60a1-9d5e-407e-80a3-52aa29c878a7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1936.395550] env[67977]: DEBUG oslo_vmware.api [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Task: {'id': task-3468276, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064251} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1936.395790] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1936.395969] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1936.396171] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1936.396340] env[67977]: INFO nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Took 0.60 seconds to destroy the instance on the hypervisor. 
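The "Waiting for the task: (returnval){ ... } to complete", "progress is 0%" and "completed successfully" lines that recur above are all emitted by one polling loop: oslo.vmware re-reads the task's info property until it reaches a terminal state, and on failure hands the server-side fault to a translator (the earlier "Fault InvalidArgument not matched" line is that translator failing to find a specific exception class and falling back to the generic VimFaultException). A minimal sketch of the pattern; read_task_info is a hypothetical callable standing in for the real property-collector round-trip, not oslo.vmware's actual API:

    import time

    class VimFaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""
        def __init__(self, message, fault_list):
            super().__init__(message)
            self.fault_list = fault_list

    # Named faults would map to specific exception classes here; leaving
    # the registry empty reproduces the "Fault ... not matched" fallback.
    FAULT_CLASSES = {}

    def translate_fault(error):
        cls = FAULT_CLASSES.get(error["fault_name"])
        if cls is None:
            return VimFaultException(error["message"], [error["fault_name"]])
        return cls(error["message"])

    def wait_for_task(read_task_info, interval=0.5):
        """Poll a vCenter task until it reaches a terminal state."""
        while True:
            info = read_task_info()  # one property-collector round-trip
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                raise translate_fault(info["error"])
            # 'queued'/'running': the "progress is N%" DEBUG lines are
            # logged at this point before sleeping and polling again.
            time.sleep(interval)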
[ 1936.398707] env[67977]: DEBUG nova.compute.claims [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1936.398907] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1936.399163] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1936.403068] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1936.452297] env[67977]: DEBUG oslo_vmware.rw_handles [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0cef9645-5b58-40de-85ca-be0290b640bc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1936.511528] env[67977]: DEBUG oslo_vmware.rw_handles [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1936.511692] env[67977]: DEBUG oslo_vmware.rw_handles [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0cef9645-5b58-40de-85ca-be0290b640bc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1936.627588] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8923535e-bf9b-4b55-a696-2e6f12d94614 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1936.636478] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8dde15af-6496-4397-923d-c413809a9fd5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1936.665468] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a1cab8d-7e99-4a96-8abb-f34bbf42d74d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1936.672331] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b877c6f-83bb-41df-b8aa-74b499c8ccc0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1936.684909] env[67977]: DEBUG nova.compute.provider_tree [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1936.693351] env[67977]: DEBUG nova.scheduler.client.report [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1936.706951] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.308s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1936.707482] env[67977]: ERROR nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1936.707482] env[67977]: Faults: ['InvalidArgument'] [ 1936.707482] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Traceback (most recent call last): [ 1936.707482] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1936.707482] 
env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] self.driver.spawn(context, instance, image_meta, [ 1936.707482] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1936.707482] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1936.707482] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1936.707482] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] self._fetch_image_if_missing(context, vi) [ 1936.707482] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1936.707482] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] image_cache(vi, tmp_image_ds_loc) [ 1936.707482] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] vm_util.copy_virtual_disk( [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] session._wait_for_task(vmdk_copy_task) [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] return self.wait_for_task(task_ref) [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] return evt.wait() [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] result = hub.switch() [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] return self.greenlet.switch() [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1936.707940] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] self.f(*self.args, **self.kw) [ 1936.708329] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1936.708329] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] raise exceptions.translate_fault(task_info.error) [ 1936.708329] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1936.708329] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Faults: ['InvalidArgument'] [ 1936.708329] env[67977]: ERROR nova.compute.manager [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] [ 1936.708329] env[67977]: DEBUG nova.compute.utils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1936.709452] env[67977]: DEBUG nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Build of instance 6fae5126-6618-4337-9a52-d6019727e0b0 was re-scheduled: A specified parameter was not correct: fileType [ 1936.709452] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1936.709835] env[67977]: DEBUG nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1936.710031] env[67977]: DEBUG nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1936.710212] env[67977]: DEBUG nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1936.710374] env[67977]: DEBUG nova.network.neutron [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1937.042441] env[67977]: DEBUG nova.network.neutron [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1937.053181] env[67977]: INFO nova.compute.manager [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Took 0.34 seconds to deallocate network for instance. [ 1937.143280] env[67977]: INFO nova.scheduler.client.report [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Deleted allocations for instance 6fae5126-6618-4337-9a52-d6019727e0b0 [ 1937.166794] env[67977]: DEBUG oslo_concurrency.lockutils [None req-3391fc65-34f1-495e-86c1-192f93a3346a tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "6fae5126-6618-4337-9a52-d6019727e0b0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 572.072s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1937.168111] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "6fae5126-6618-4337-9a52-d6019727e0b0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 375.955s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1937.168404] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "6fae5126-6618-4337-9a52-d6019727e0b0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1937.168640] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "6fae5126-6618-4337-9a52-d6019727e0b0-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1937.168844] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "6fae5126-6618-4337-9a52-d6019727e0b0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1937.171020] env[67977]: INFO nova.compute.manager [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Terminating instance [ 1937.172815] env[67977]: DEBUG nova.compute.manager [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1937.173120] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1937.173719] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-acceeb8f-88d7-44f4-b368-e9cd6601fbde {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1937.183638] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c6c9d26-fe0c-467b-90f2-6d63bb46a383 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1937.195564] env[67977]: DEBUG nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1937.217129] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6fae5126-6618-4337-9a52-d6019727e0b0 could not be found. 
[ 1937.217334] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1937.217963] env[67977]: INFO nova.compute.manager [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1937.217963] env[67977]: DEBUG oslo.service.loopingcall [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1937.218116] env[67977]: DEBUG nova.compute.manager [-] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1937.218116] env[67977]: DEBUG nova.network.neutron [-] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1937.245520] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1937.245806] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1937.247380] env[67977]: INFO nova.compute.claims [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1937.435389] env[67977]: DEBUG nova.network.neutron [-] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1937.445347] env[67977]: INFO nova.compute.manager [-] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] Took 0.23 seconds to deallocate network for instance. 
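
Editor's note: the Acquiring/acquired/released triplets around "compute_resources" above all follow one pattern. A minimal sketch (illustrative only, using oslo.concurrency's public API): the resource tracker serializes every claim under a single named lock, and the "waited"/"held" durations in the log are measured around that critical section.

    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def instance_claim(memory_mb, vcpus, disk_gb):
        # Inside the lock: test the request against free resources and
        # record the usage before the lock is released. Concurrent
        # builds block here, which shows up as non-zero "waited" times.
        return {"MEMORY_MB": memory_mb, "VCPU": vcpus, "DISK_GB": disk_gb}

    print(instance_claim(128, 1, 1))
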
[ 1937.465981] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcfd3ada-dbec-4cb0-b053-2d244a257976 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1937.474063] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51d7998e-3d7a-4d6a-be88-f2c35bb88c4c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1937.508237] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-933ffa41-480f-4edd-ba4a-80afe9c791b5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1937.515588] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5f92dca-a5cc-4914-93ee-0449c36894ff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1937.534976] env[67977]: DEBUG nova.compute.provider_tree [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1937.543035] env[67977]: DEBUG nova.scheduler.client.report [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1937.559341] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.313s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1937.559961] env[67977]: DEBUG nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Start building networks asynchronously for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1937.565208] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4005025a-f646-4bde-9957-0d979afd85dd tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "6fae5126-6618-4337-9a52-d6019727e0b0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.397s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1937.566557] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "6fae5126-6618-4337-9a52-d6019727e0b0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 32.841s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1937.566739] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 6fae5126-6618-4337-9a52-d6019727e0b0] During sync_power_state the instance has a pending task (deleting). Skip. [ 1937.566914] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "6fae5126-6618-4337-9a52-d6019727e0b0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1937.593079] env[67977]: DEBUG nova.compute.utils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1937.595770] env[67977]: DEBUG nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1937.595770] env[67977]: DEBUG nova.network.neutron [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1937.602168] env[67977]: DEBUG nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1937.663769] env[67977]: DEBUG nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1937.680649] env[67977]: DEBUG nova.policy [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd71a62e4fe3f4a59b7606ef17ea6d0b9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6a2fbea0321a44e7ac6812f9856e8116', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1937.691047] env[67977]: DEBUG nova.virt.hardware [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1937.691291] env[67977]: DEBUG nova.virt.hardware [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1937.691448] env[67977]: DEBUG nova.virt.hardware [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1937.691648] env[67977]: DEBUG nova.virt.hardware [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1937.691799] env[67977]: DEBUG nova.virt.hardware [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1937.692272] env[67977]: DEBUG nova.virt.hardware [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1937.692537] env[67977]: DEBUG 
nova.virt.hardware [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1937.692707] env[67977]: DEBUG nova.virt.hardware [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1937.692892] env[67977]: DEBUG nova.virt.hardware [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1937.693096] env[67977]: DEBUG nova.virt.hardware [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1937.693280] env[67977]: DEBUG nova.virt.hardware [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1937.694223] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f163a14-eaa7-4a60-9044-ed78ed114e0d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1937.702385] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-490bc8c1-09d8-4a3a-ae62-960edc9eeb38 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1938.009382] env[67977]: DEBUG nova.network.neutron [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Successfully created port: bd004906-0376-4d82-99b6-8849ba812e9a {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1938.692557] env[67977]: DEBUG nova.network.neutron [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Successfully updated port: bd004906-0376-4d82-99b6-8849ba812e9a {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1938.706317] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "refresh_cache-98f7c8cc-b27f-406c-b34d-22c2c4609e24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1938.706504] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 
tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquired lock "refresh_cache-98f7c8cc-b27f-406c-b34d-22c2c4609e24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1938.706704] env[67977]: DEBUG nova.network.neutron [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1938.744448] env[67977]: DEBUG nova.network.neutron [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1938.897431] env[67977]: DEBUG nova.network.neutron [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Updating instance_info_cache with network_info: [{"id": "bd004906-0376-4d82-99b6-8849ba812e9a", "address": "fa:16:3e:6a:d2:74", "network": {"id": "8472d1a8-8b3a-40f7-a74a-3449f67e4cb2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1078881203-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6a2fbea0321a44e7ac6812f9856e8116", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6815237d-f565-474d-a3c0-9c675478eb00", "external-id": "nsx-vlan-transportzone-526", "segmentation_id": 526, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbd004906-03", "ovs_interfaceid": "bd004906-0376-4d82-99b6-8849ba812e9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1938.909907] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Releasing lock "refresh_cache-98f7c8cc-b27f-406c-b34d-22c2c4609e24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1938.910224] env[67977]: DEBUG nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Instance network_info: |[{"id": "bd004906-0376-4d82-99b6-8849ba812e9a", "address": "fa:16:3e:6a:d2:74", "network": {"id": "8472d1a8-8b3a-40f7-a74a-3449f67e4cb2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1078881203-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": 
"192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6a2fbea0321a44e7ac6812f9856e8116", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6815237d-f565-474d-a3c0-9c675478eb00", "external-id": "nsx-vlan-transportzone-526", "segmentation_id": 526, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbd004906-03", "ovs_interfaceid": "bd004906-0376-4d82-99b6-8849ba812e9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1938.910609] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6a:d2:74', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6815237d-f565-474d-a3c0-9c675478eb00', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bd004906-0376-4d82-99b6-8849ba812e9a', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1938.918105] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Creating folder: Project (6a2fbea0321a44e7ac6812f9856e8116). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1938.918590] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-239d0818-6f9a-4e6c-9f5e-b5bff18d4e41 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1938.930436] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Created folder: Project (6a2fbea0321a44e7ac6812f9856e8116) in parent group-v693022. [ 1938.930615] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Creating folder: Instances. Parent ref: group-v693118. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1938.930830] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-26a6e971-7583-4324-b543-8575c007f87a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1938.939706] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Created folder: Instances in parent group-v693118. 
[ 1938.939706] env[67977]: DEBUG oslo.service.loopingcall [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1938.939706] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1938.939706] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-835ce614-82b5-464d-9e7f-860650b42813 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1938.960331] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1938.960331] env[67977]: value = "task-3468279" [ 1938.960331] env[67977]: _type = "Task" [ 1938.960331] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1938.967812] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468279, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1939.079124] env[67977]: DEBUG nova.compute.manager [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Received event network-vif-plugged-bd004906-0376-4d82-99b6-8849ba812e9a {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1939.079340] env[67977]: DEBUG oslo_concurrency.lockutils [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] Acquiring lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1939.079556] env[67977]: DEBUG oslo_concurrency.lockutils [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] Lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1939.079742] env[67977]: DEBUG oslo_concurrency.lockutils [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] Lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1939.079898] env[67977]: DEBUG nova.compute.manager [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] No waiting events found dispatching network-vif-plugged-bd004906-0376-4d82-99b6-8849ba812e9a {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1939.080075] env[67977]: WARNING nova.compute.manager [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] 
Received unexpected event network-vif-plugged-bd004906-0376-4d82-99b6-8849ba812e9a for instance with vm_state building and task_state spawning. [ 1939.080242] env[67977]: DEBUG nova.compute.manager [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Received event network-changed-bd004906-0376-4d82-99b6-8849ba812e9a {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1939.080395] env[67977]: DEBUG nova.compute.manager [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Refreshing instance network info cache due to event network-changed-bd004906-0376-4d82-99b6-8849ba812e9a. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1939.080575] env[67977]: DEBUG oslo_concurrency.lockutils [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] Acquiring lock "refresh_cache-98f7c8cc-b27f-406c-b34d-22c2c4609e24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1939.080710] env[67977]: DEBUG oslo_concurrency.lockutils [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] Acquired lock "refresh_cache-98f7c8cc-b27f-406c-b34d-22c2c4609e24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1939.080863] env[67977]: DEBUG nova.network.neutron [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Refreshing network info cache for port bd004906-0376-4d82-99b6-8849ba812e9a {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1939.352237] env[67977]: DEBUG nova.network.neutron [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Updated VIF entry in instance network info cache for port bd004906-0376-4d82-99b6-8849ba812e9a. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1939.352585] env[67977]: DEBUG nova.network.neutron [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Updating instance_info_cache with network_info: [{"id": "bd004906-0376-4d82-99b6-8849ba812e9a", "address": "fa:16:3e:6a:d2:74", "network": {"id": "8472d1a8-8b3a-40f7-a74a-3449f67e4cb2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1078881203-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6a2fbea0321a44e7ac6812f9856e8116", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6815237d-f565-474d-a3c0-9c675478eb00", "external-id": "nsx-vlan-transportzone-526", "segmentation_id": 526, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbd004906-03", "ovs_interfaceid": "bd004906-0376-4d82-99b6-8849ba812e9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1939.361524] env[67977]: DEBUG oslo_concurrency.lockutils [req-ecc72ee2-518a-4d79-a353-8bf5a948a420 req-f8333887-4a21-47e6-bc2e-bd5a3f144665 service nova] Releasing lock "refresh_cache-98f7c8cc-b27f-406c-b34d-22c2c4609e24" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1939.472198] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468279, 'name': CreateVM_Task, 'duration_secs': 0.285787} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1939.472343] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1939.472962] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1939.473141] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1939.473451] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1939.473693] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c3918f03-55fe-4ca2-bec6-74e080cf279b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1939.477811] env[67977]: DEBUG oslo_vmware.api [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Waiting for the task: (returnval){ [ 1939.477811] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52099a60-0f14-2229-8b88-0254f94447b4" [ 1939.477811] env[67977]: _type = "Task" [ 1939.477811] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1939.484889] env[67977]: DEBUG oslo_vmware.api [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52099a60-0f14-2229-8b88-0254f94447b4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1939.988323] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1939.988649] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1939.988753] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1941.821023] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1941.821342] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1942.776862] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1944.771580] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1944.775255] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1944.775444] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1944.775587] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1945.775659] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1947.775176] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1947.775559] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1947.775559] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1947.800131] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.800334] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.800432] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.800557] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.800680] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.800801] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.800933] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.801070] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.801194] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.801312] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.801433] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1947.801946] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1947.814871] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1947.815030] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1947.815202] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1947.815359] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1947.816445] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58b470b3-3416-4cb1-ba3a-374c72a060d7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.825274] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a3e4609-9343-4509-a7d8-b79f8ed45517 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.839176] env[67977]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56007593-9c88-40c1-b7a5-acc58278c4ca {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.845526] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0253a8c1-49c5-4fb4-a363-9586d5aef029 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.874136] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180880MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1947.874320] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1947.874477] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1947.945797] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance b56ab7a8-cd27-4542-8082-ec023c57e153 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1947.945962] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1947.946106] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1947.946233] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d1fc2ae5-fa11-41a7-808b-13da16667078 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1947.946358] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 511896d4-d9cb-42e0-b213-31be3cac191c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1947.946478] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1947.946595] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1947.946710] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1947.946823] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1947.946938] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 98f7c8cc-b27f-406c-b34d-22c2c4609e24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1947.947155] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1947.947355] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1948.073184] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9e46aa8-47bf-4275-bdcb-c6fc3b100ceb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1948.081212] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f0d5a49-4134-4fb7-ac19-5a44e858707d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1948.114102] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d8694ad-8cd8-4a9b-b72a-9f9faebb0aa0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1948.121490] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66b60b93-974d-4126-a269-70a190a62e8e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1948.134319] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1948.142421] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1948.160340] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1948.160535] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.286s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1951.308676] env[67977]: DEBUG oslo_concurrency.lockutils [None 
req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "142d3b29-b467-4007-84ac-8b7e0ee9e326" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1951.308931] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "142d3b29-b467-4007-84ac-8b7e0ee9e326" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1955.170320] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1970.788105] env[67977]: DEBUG oslo_concurrency.lockutils [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1983.211842] env[67977]: WARNING oslo_vmware.rw_handles [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1983.211842] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1983.211842] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1983.211842] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1983.211842] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1983.211842] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 1983.211842] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1983.211842] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1983.211842] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1983.211842] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1983.211842] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1983.211842] env[67977]: ERROR oslo_vmware.rw_handles [ 1983.211842] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: 
b56ab7a8-cd27-4542-8082-ec023c57e153] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/0cef9645-5b58-40de-85ca-be0290b640bc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1983.215104] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1983.215376] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Copying Virtual Disk [datastore1] vmware_temp/0cef9645-5b58-40de-85ca-be0290b640bc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/0cef9645-5b58-40de-85ca-be0290b640bc/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1983.215683] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4a4413e6-767c-4fd1-8dae-b613cba20008 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.224397] env[67977]: DEBUG oslo_vmware.api [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for the task: (returnval){ [ 1983.224397] env[67977]: value = "task-3468280" [ 1983.224397] env[67977]: _type = "Task" [ 1983.224397] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1983.232245] env[67977]: DEBUG oslo_vmware.api [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Task: {'id': task-3468280, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1983.735678] env[67977]: DEBUG oslo_vmware.exceptions [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1983.735678] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1983.735967] env[67977]: ERROR nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1983.735967] env[67977]: Faults: ['InvalidArgument']
[ 1983.735967] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Traceback (most recent call last):
[ 1983.735967] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1983.735967] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] yield resources
[ 1983.735967] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1983.735967] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] self.driver.spawn(context, instance, image_meta,
[ 1983.735967] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1983.735967] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1983.735967] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1983.735967] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] self._fetch_image_if_missing(context, vi)
[ 1983.735967] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] image_cache(vi, tmp_image_ds_loc)
[ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] vm_util.copy_virtual_disk(
[ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] session._wait_for_task(vmdk_copy_task)
[ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] return self.wait_for_task(task_ref) [ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] return evt.wait() [ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] result = hub.switch() [ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1983.736440] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] return self.greenlet.switch() [ 1983.736818] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1983.736818] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] self.f(*self.args, **self.kw) [ 1983.736818] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1983.736818] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] raise exceptions.translate_fault(task_info.error) [ 1983.736818] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1983.736818] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Faults: ['InvalidArgument'] [ 1983.736818] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] [ 1983.736818] env[67977]: INFO nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Terminating instance [ 1983.740149] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1983.740149] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1983.740149] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-981d9f80-bb59-4d96-9789-32b45d51ead9 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.741527] env[67977]: DEBUG nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1983.741731] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1983.742480] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c05cc976-3306-4750-96eb-08f6923a97f4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.749357] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1983.749612] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6fc50e37-c455-4063-bdc8-9b01fd0b5b9f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.751808] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1983.751984] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1983.752974] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f0c5dd41-c9f6-4ddd-b910-f91f955ccb99 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.758076] env[67977]: DEBUG oslo_vmware.api [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for the task: (returnval){ [ 1983.758076] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]525eed63-7ca7-f26c-9855-94dacce2cb83" [ 1983.758076] env[67977]: _type = "Task" [ 1983.758076] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1983.764990] env[67977]: DEBUG oslo_vmware.api [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]525eed63-7ca7-f26c-9855-94dacce2cb83, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1983.823400] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1983.823634] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1983.823833] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Deleting the datastore file [datastore1] b56ab7a8-cd27-4542-8082-ec023c57e153 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1983.824116] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6490426f-89a4-4c80-a8c4-6a270fcdc348 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.830063] env[67977]: DEBUG oslo_vmware.api [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for the task: (returnval){ [ 1983.830063] env[67977]: value = "task-3468282" [ 1983.830063] env[67977]: _type = "Task" [ 1983.830063] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1983.837447] env[67977]: DEBUG oslo_vmware.api [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Task: {'id': task-3468282, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1984.268318] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1984.268671] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Creating directory with path [datastore1] vmware_temp/5b1d5152-57f2-4b41-ab93-1d74a4d2ac54/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1984.268821] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-811c23bb-1429-4155-af8a-78148861c477 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1984.280605] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Created directory with path [datastore1] vmware_temp/5b1d5152-57f2-4b41-ab93-1d74a4d2ac54/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1984.280605] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Fetch image to [datastore1] vmware_temp/5b1d5152-57f2-4b41-ab93-1d74a4d2ac54/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1984.280605] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/5b1d5152-57f2-4b41-ab93-1d74a4d2ac54/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1984.281253] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3af7ebf8-9f9d-4efd-b4ae-b1687386d25d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1984.284461] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1984.289140] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea411812-0de1-4a45-84ad-c63e54879eed {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1984.299937] env[67977]: DEBUG oslo_vmware.service [-] Invoking
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad2b8beb-d39e-4029-89a5-37c60a7a86bc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.331244] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96f1b932-b9c1-4144-ae79-68df31b63d8e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.341925] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bb85b8a1-053c-4018-9ab7-5d9ac30bd5a0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.343406] env[67977]: DEBUG oslo_vmware.api [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Task: {'id': task-3468282, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074177} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1984.343648] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1984.343823] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1984.343997] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1984.344216] env[67977]: INFO nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Took 0.60 seconds to destroy the instance on the hypervisor. 
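The Task: {'id': task-3468282, ...} lines above show oslo.vmware's task handling: the vSphere call returns a Task moref immediately, and the client polls the task's info until it reaches success or error (the DeleteDatastoreFile_Task here finishes with duration_secs: 0.074177). A simplified sketch of that polling loop; get_task_info is a hypothetical stand-in for reading the Task object's info property, and the real loop yields via an eventlet loopingcall rather than sleeping:

    import time

    class TaskFailed(RuntimeError):
        pass

    def wait_for_task(get_task_info, poll_interval=0.5, timeout=300.0):
        # get_task_info() returns an object with .state in
        # {'queued', 'running', 'success', 'error'}, plus .progress and .error.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info.state == 'success':
                return info
            if info.state == 'error':
                # oslo.vmware translates this into a VimFaultException,
                # as seen in the traceback above
                raise TaskFailed(str(info.error))
            time.sleep(poll_interval)
        raise TimeoutError('task did not complete within %.0fs' % timeout)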
[ 1984.346239] env[67977]: DEBUG nova.compute.claims [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1984.346445] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1984.346664] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1984.363701] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1984.415760] env[67977]: DEBUG oslo_vmware.rw_handles [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5b1d5152-57f2-4b41-ab93-1d74a4d2ac54/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1984.475540] env[67977]: DEBUG oslo_vmware.rw_handles [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1984.475703] env[67977]: DEBUG oslo_vmware.rw_handles [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5b1d5152-57f2-4b41-ab93-1d74a4d2ac54/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1984.580274] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2eef7326-8c17-4ff6-bbcb-bdda9934507d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.587381] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bc352e4-1d18-4980-936a-53ce33f78ef8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.618622] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db57f31e-c2d3-46d8-95fe-9d12baea2642 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.625363] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c946e0c-ab12-4b3e-b958-42b25f7facd3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.637915] env[67977]: DEBUG nova.compute.provider_tree [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1984.646419] env[67977]: DEBUG nova.scheduler.client.report [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1984.662189] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.315s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1984.662700] env[67977]: ERROR nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1984.662700] env[67977]: Faults: ['InvalidArgument'] [ 1984.662700] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Traceback (most recent call last): [ 1984.662700] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1984.662700] 
env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] self.driver.spawn(context, instance, image_meta, [ 1984.662700] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1984.662700] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1984.662700] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1984.662700] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] self._fetch_image_if_missing(context, vi) [ 1984.662700] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1984.662700] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] image_cache(vi, tmp_image_ds_loc) [ 1984.662700] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] vm_util.copy_virtual_disk( [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] session._wait_for_task(vmdk_copy_task) [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] return self.wait_for_task(task_ref) [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] return evt.wait() [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] result = hub.switch() [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] return self.greenlet.switch() [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1984.663122] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] self.f(*self.args, **self.kw) [ 1984.663523] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1984.663523] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] raise exceptions.translate_fault(task_info.error) [ 1984.663523] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1984.663523] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Faults: ['InvalidArgument'] [ 1984.663523] env[67977]: ERROR nova.compute.manager [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] [ 1984.663523] env[67977]: DEBUG nova.compute.utils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1984.664839] env[67977]: DEBUG nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Build of instance b56ab7a8-cd27-4542-8082-ec023c57e153 was re-scheduled: A specified parameter was not correct: fileType [ 1984.664839] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1984.665225] env[67977]: DEBUG nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1984.665401] env[67977]: DEBUG nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1984.665569] env[67977]: DEBUG nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1984.665738] env[67977]: DEBUG nova.network.neutron [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1985.207967] env[67977]: DEBUG nova.network.neutron [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1985.222405] env[67977]: INFO nova.compute.manager [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Took 0.56 seconds to deallocate network for instance.
[ 1985.318742] env[67977]: INFO nova.scheduler.client.report [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Deleted allocations for instance b56ab7a8-cd27-4542-8082-ec023c57e153
[ 1985.338839] env[67977]: DEBUG oslo_concurrency.lockutils [None req-27d61baf-4023-4a57-bd85-b938761947a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "b56ab7a8-cd27-4542-8082-ec023c57e153" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 604.398s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1985.339982] env[67977]: DEBUG oslo_concurrency.lockutils [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "b56ab7a8-cd27-4542-8082-ec023c57e153" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 408.357s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1985.340467] env[67977]: DEBUG oslo_concurrency.lockutils [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "b56ab7a8-cd27-4542-8082-ec023c57e153-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1985.340704] env[67977]: DEBUG oslo_concurrency.lockutils [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "b56ab7a8-cd27-4542-8082-ec023c57e153-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1985.340879] env[67977]: DEBUG oslo_concurrency.lockutils [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "b56ab7a8-cd27-4542-8082-ec023c57e153-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1985.342907] env[67977]: INFO nova.compute.manager [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Terminating instance [ 1985.344839] env[67977]: DEBUG nova.compute.manager [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1985.345100] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1985.345528] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-46ace58c-2881-4593-af73-f15e71ef494a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1985.354195] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9180635f-99db-4b61-8495-4f7dc50eb1c1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1985.364943] env[67977]: DEBUG nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1985.385922] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b56ab7a8-cd27-4542-8082-ec023c57e153 could not be found. 
[ 1985.386137] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1985.386318] env[67977]: INFO nova.compute.manager [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1985.386552] env[67977]: DEBUG oslo.service.loopingcall [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1985.387024] env[67977]: DEBUG nova.compute.manager [-] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1985.387167] env[67977]: DEBUG nova.network.neutron [-] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1985.411382] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1985.411646] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1985.413119] env[67977]: INFO nova.compute.claims [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1985.416141] env[67977]: DEBUG nova.network.neutron [-] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1985.424822] env[67977]: INFO nova.compute.manager [-] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] Took 0.04 seconds to deallocate network for instance.
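The Acquiring/acquired/released lines throughout, with their waited/held timings, come from oslo.concurrency's lock wrapper; serializing everything that touches "compute_resources" is what keeps the resource tracker's accounting consistent while claims, aborts and the periodic update race each other. The pattern, reduced to a sketch (instance_claim here is a stand-in, not the real method signature):

    from oslo_concurrency import lockutils

    # The decorator produces the Acquiring/acquired/released DEBUG lines
    # seen above, including how long the caller waited and held the lock.
    @lockutils.synchronized('compute_resources')
    def instance_claim(free_vcpus, requested_vcpus):
        # Hypothetical claim check; the real ResourceTracker method takes
        # context, instance, nodename, allocations and updates usage.
        if requested_vcpus > free_vcpus:
            raise RuntimeError('insufficient vCPU capacity')
        return free_vcpus - requested_vcpus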
[ 1985.546458] env[67977]: DEBUG oslo_concurrency.lockutils [None req-739709e3-16c9-477a-981d-55978e97c4a7 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "b56ab7a8-cd27-4542-8082-ec023c57e153" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.206s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1985.547328] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "b56ab7a8-cd27-4542-8082-ec023c57e153" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 80.822s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1985.547524] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: b56ab7a8-cd27-4542-8082-ec023c57e153] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1985.547700] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "b56ab7a8-cd27-4542-8082-ec023c57e153" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1985.591569] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c5ba6f7-9035-48cb-abf6-32e8fac35a5d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1985.599279] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac976f9e-e180-4887-b7be-f18f08649d7f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1985.630368] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4366f91-2c15-49d5-8707-fd17a8a499f5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1985.637374] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db39be33-50d1-48eb-a8c1-b589ce0501de {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1985.650018] env[67977]: DEBUG nova.compute.provider_tree [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1985.657829] env[67977]: DEBUG nova.scheduler.client.report [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0,
'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1985.671452] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1985.671925] env[67977]: DEBUG nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1985.704597] env[67977]: DEBUG nova.compute.utils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1985.705792] env[67977]: DEBUG nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1985.705960] env[67977]: DEBUG nova.network.neutron [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1985.715435] env[67977]: DEBUG nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1985.765104] env[67977]: DEBUG nova.policy [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '78df84566c65469890b3b6f15f3e5e01', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ff581ae563e45108f497cade6990d79', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 1985.775336] env[67977]: DEBUG nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1985.799924] env[67977]: DEBUG nova.virt.hardware [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1985.800193] env[67977]: DEBUG nova.virt.hardware [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1985.800352] env[67977]: DEBUG nova.virt.hardware [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1985.800592] env[67977]: DEBUG nova.virt.hardware [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1985.800680] env[67977]: DEBUG nova.virt.hardware [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1985.800816] env[67977]: DEBUG nova.virt.hardware [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1985.801029] env[67977]: DEBUG nova.virt.hardware [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1985.801225] env[67977]: DEBUG nova.virt.hardware [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1985.801399] env[67977]: DEBUG nova.virt.hardware [None 
req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1985.801561] env[67977]: DEBUG nova.virt.hardware [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1985.801732] env[67977]: DEBUG nova.virt.hardware [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1985.802595] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68472b96-8213-4d7c-bda8-4e0424d48caf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1985.810695] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-695b5296-a4df-4796-978d-d910203d0df1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.058030] env[67977]: DEBUG nova.network.neutron [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Successfully created port: 46a529dc-c0ef-4042-8b5e-f2d7af2f209a {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1986.730741] env[67977]: DEBUG nova.network.neutron [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Successfully updated port: 46a529dc-c0ef-4042-8b5e-f2d7af2f209a {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1986.742095] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "refresh_cache-142d3b29-b467-4007-84ac-8b7e0ee9e326" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1986.742251] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "refresh_cache-142d3b29-b467-4007-84ac-8b7e0ee9e326" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1986.742401] env[67977]: DEBUG nova.network.neutron [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1986.801824] env[67977]: DEBUG nova.network.neutron [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 
tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1986.991699] env[67977]: DEBUG nova.network.neutron [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Updating instance_info_cache with network_info: [{"id": "46a529dc-c0ef-4042-8b5e-f2d7af2f209a", "address": "fa:16:3e:12:b3:2d", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap46a529dc-c0", "ovs_interfaceid": "46a529dc-c0ef-4042-8b5e-f2d7af2f209a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1987.004409] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "refresh_cache-142d3b29-b467-4007-84ac-8b7e0ee9e326" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1987.004678] env[67977]: DEBUG nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Instance network_info: |[{"id": "46a529dc-c0ef-4042-8b5e-f2d7af2f209a", "address": "fa:16:3e:12:b3:2d", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap46a529dc-c0", "ovs_interfaceid": "46a529dc-c0ef-4042-8b5e-f2d7af2f209a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1987.005058] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:12:b3:2d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5efce30e-48dd-493a-a354-f562a8adf7af', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '46a529dc-c0ef-4042-8b5e-f2d7af2f209a', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1987.012643] env[67977]: DEBUG oslo.service.loopingcall [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1987.013131] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1987.013370] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dbfcf4f2-d7cc-4901-a5e0-9c2ce7bcfd45 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1987.033711] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1987.033711] env[67977]: value = "task-3468283" [ 1987.033711] env[67977]: _type = "Task" [ 1987.033711] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1987.041247] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468283, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1987.250743] env[67977]: DEBUG nova.compute.manager [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Received event network-vif-plugged-46a529dc-c0ef-4042-8b5e-f2d7af2f209a {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1987.251222] env[67977]: DEBUG oslo_concurrency.lockutils [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] Acquiring lock "142d3b29-b467-4007-84ac-8b7e0ee9e326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1987.251274] env[67977]: DEBUG oslo_concurrency.lockutils [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] Lock "142d3b29-b467-4007-84ac-8b7e0ee9e326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1987.251468] env[67977]: DEBUG oslo_concurrency.lockutils [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] Lock "142d3b29-b467-4007-84ac-8b7e0ee9e326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1987.251614] env[67977]: DEBUG nova.compute.manager [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] No waiting events found dispatching network-vif-plugged-46a529dc-c0ef-4042-8b5e-f2d7af2f209a {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1987.251797] env[67977]: WARNING nova.compute.manager [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Received unexpected event network-vif-plugged-46a529dc-c0ef-4042-8b5e-f2d7af2f209a for instance with vm_state building and task_state spawning. [ 1987.251964] env[67977]: DEBUG nova.compute.manager [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Received event network-changed-46a529dc-c0ef-4042-8b5e-f2d7af2f209a {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1987.252130] env[67977]: DEBUG nova.compute.manager [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Refreshing instance network info cache due to event network-changed-46a529dc-c0ef-4042-8b5e-f2d7af2f209a. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1987.252318] env[67977]: DEBUG oslo_concurrency.lockutils [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] Acquiring lock "refresh_cache-142d3b29-b467-4007-84ac-8b7e0ee9e326" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1987.252455] env[67977]: DEBUG oslo_concurrency.lockutils [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] Acquired lock "refresh_cache-142d3b29-b467-4007-84ac-8b7e0ee9e326" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1987.252636] env[67977]: DEBUG nova.network.neutron [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Refreshing network info cache for port 46a529dc-c0ef-4042-8b5e-f2d7af2f209a {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1987.527949] env[67977]: DEBUG nova.network.neutron [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Updated VIF entry in instance network info cache for port 46a529dc-c0ef-4042-8b5e-f2d7af2f209a. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1987.528319] env[67977]: DEBUG nova.network.neutron [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Updating instance_info_cache with network_info: [{"id": "46a529dc-c0ef-4042-8b5e-f2d7af2f209a", "address": "fa:16:3e:12:b3:2d", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap46a529dc-c0", "ovs_interfaceid": "46a529dc-c0ef-4042-8b5e-f2d7af2f209a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1987.539031] env[67977]: DEBUG oslo_concurrency.lockutils [req-34baf269-6af8-4c7f-b795-f3fd72e556ea req-ba129672-b149-4e56-847b-9ffd5a9a2a49 service nova] Releasing lock "refresh_cache-142d3b29-b467-4007-84ac-8b7e0ee9e326" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1987.545208] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468283, 'name': CreateVM_Task, 'duration_secs': 0.289874} completed successfully. 
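The network-vif-plugged records above illustrate the external-event handshake: Neutron's notification arrives before anything has registered to wait for it, so the manager pops no event and logs the "unexpected event" warning instead. A toy version of that pop-or-warn registry (threading.Event standing in for nova's event objects; names are illustrative):

    import threading
    from collections import defaultdict

    class InstanceEvents:
        """Toy pop-or-warn dispatch: an external event only matters
        if a waiter registered for it first."""

        def __init__(self):
            self._lock = threading.Lock()      # plays the "-events" lock above
            self._waiters = defaultdict(dict)  # instance -> {event_name: Event}

        def prepare_for_event(self, instance, name):
            ev = threading.Event()
            with self._lock:
                self._waiters[instance][name] = ev
            return ev

        def pop_instance_event(self, instance, name):
            with self._lock:
                return self._waiters[instance].pop(name, None)

        def deliver(self, instance, name):
            ev = self.pop_instance_event(instance, name)
            if ev is None:
                print("Received unexpected event %s for %s" % (name, instance))
                return False
            ev.set()
            return True

    events = InstanceEvents()
    # No waiter registered yet -> the "unexpected event" warning path.
    events.deliver('142d3b29', 'network-vif-plugged-46a529dc')
    # With a waiter registered first, delivery sets the event instead.
    waiter = events.prepare_for_event('142d3b29', 'network-changed-46a529dc')
    events.deliver('142d3b29', 'network-changed-46a529dc')
    assert waiter.is_set()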
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1987.545363] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1987.551330] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1987.551497] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1987.551805] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1987.552047] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-23b1e6ae-7484-425a-a2f1-5a022f10c8b6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1987.556626] env[67977]: DEBUG oslo_vmware.api [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 1987.556626] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]529a2a74-d5a8-685f-6bf0-dea077f1b92b" [ 1987.556626] env[67977]: _type = "Task" [ 1987.556626] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1987.564211] env[67977]: DEBUG oslo_vmware.api [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]529a2a74-d5a8-685f-6bf0-dea077f1b92b, 'name': SearchDatastore_Task} progress is 0%. 
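Nearly every step above runs inside a named lockutils lock, and the DEBUG lines report how long each caller waited for, and then held, the lock. A process-local sketch of that pattern, assuming a plain threading.Lock per name (the real lockutils also supports external file locks and the semaphores mentioned above, which this omits):

    import threading
    import time
    from collections import defaultdict
    from contextlib import contextmanager

    _locks = defaultdict(threading.Lock)   # process-local named locks

    @contextmanager
    def named_lock(name, owner):
        """Log wait and hold times around a named lock, like the
        lockutils 'acquired/released' records above."""
        lock = _locks[name]
        t0 = time.monotonic()
        with lock:
            waited = time.monotonic() - t0
            print('Lock "%s" acquired by "%s" :: waited %.3fs'
                  % (name, owner, waited))
            t1 = time.monotonic()
            try:
                yield
            finally:
                held = time.monotonic() - t1
                print('Lock "%s" released by "%s" :: held %.3fs'
                      % (name, owner, held))

    with named_lock("compute_resources", "demo"):
        time.sleep(0.01)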
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1988.068868] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1988.069186] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1988.069339] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2000.819371] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "1536ad10-129b-439d-80c5-08fa92aeaed1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2000.819680] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "1536ad10-129b-439d-80c5-08fa92aeaed1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2002.133676] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2002.770584] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2002.793242] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2003.774754] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2004.775731] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 
None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2004.776097] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2005.770791] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2005.774342] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2006.775835] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2007.776207] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2007.776585] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2007.776585] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2007.797361] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2007.797568] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2007.797615] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2007.797789] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Skipping network cache update for instance because it is Building. 
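The run_periodic_tasks records above come from a scheduler that fires each registered task once its spacing has elapsed, and individual tasks can still no-op on configuration, as _reclaim_queued_deletes does here. A compact sketch of that dispatch loop (the interval and CONF value are assumptions for the demo):

    class PeriodicRunner:
        """Fires each registered task when its spacing has elapsed."""

        def __init__(self):
            self._tasks = []   # [name, spacing_s, fn, last_run]

        def add(self, name, spacing, fn):
            self._tasks.append([name, spacing, fn, 0.0])

        def run_once(self, now):
            for task in self._tasks:
                name, spacing, fn, last = task
                if now - last >= spacing:
                    print("Running periodic task %s" % name)
                    fn()
                    task[3] = now

    reclaim_instance_interval = 0   # assumed CONF value, matching the log

    def _reclaim_queued_deletes():
        # A task can bail out immediately based on config, as logged above.
        if reclaim_instance_interval <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return

    runner = PeriodicRunner()
    runner.add("ComputeManager._reclaim_queued_deletes", 60.0,
               _reclaim_queued_deletes)
    runner.run_once(now=61.0)   # 61s past the initial last_run of 0.0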
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2007.797937] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2007.798073] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2007.798198] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2007.798318] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2007.798436] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2007.798552] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2007.798672] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. 
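The heal pass above skips every instance still in the Building state and ends with nothing to refresh. The selection logic reduces to roughly this (field names simplified from the instance objects nova actually passes):

    def pick_instance_to_heal(instances):
        """Skip instances still building; refresh the first remaining one."""
        for inst in instances:
            if inst['vm_state'] == 'building':
                print("[instance: %s] Skipping network cache update "
                      "for instance because it is Building." % inst['uuid'])
                continue
            return inst
        print("Didn't find any instances for network info cache update.")
        return None

    # Every tracked instance is still building, as in the records above.
    pick_instance_to_heal([{'uuid': 'ac4fe863', 'vm_state': 'building'}])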
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2009.775523] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2009.787287] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2009.787521] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2009.787689] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2009.787849] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2009.789083] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb552e87-d254-42bf-9fcd-3824247023cd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.798038] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16ea45fd-c1ca-4b79-ad89-eed4fba3e407 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.811949] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac62a5f0-90bf-48f4-b4f3-949ab0e992d2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.818101] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-120e7e4f-8d0d-4ebc-a408-97d14d20655a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.848434] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180917MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2009.848591] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
2009.848789] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2009.916773] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2009.916983] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2009.917136] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d1fc2ae5-fa11-41a7-808b-13da16667078 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2009.917262] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 511896d4-d9cb-42e0-b213-31be3cac191c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2009.917382] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2009.917501] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2009.917617] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2009.917735] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
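Each tracked instance above holds a placement allocation of {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}, and the "Final resource view" record below reports used_ram=1792MB, used_disk=10GB and used_vcpus=10. The arithmetic checks out if the tracker seeds used memory with the host's 512MB reservation (consistent with 'reserved': 512 in the inventory records, though that seeding is an assumption here):

    # Ten per-instance allocations, as listed in the audit above.
    allocations = [{'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}] * 10

    reserved_host_memory_mb = 512   # assumption: used RAM starts from the
                                    # inventory's 512MB host reservation

    used_ram = reserved_host_memory_mb + sum(a['MEMORY_MB'] for a in allocations)
    used_disk = sum(a['DISK_GB'] for a in allocations)
    used_vcpus = sum(a['VCPU'] for a in allocations)

    # Matches: used_ram=1792MB used_disk=10GB used_vcpus=10
    assert (used_ram, used_disk, used_vcpus) == (1792, 10, 10)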
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2009.917854] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 98f7c8cc-b27f-406c-b34d-22c2c4609e24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2009.917970] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 142d3b29-b467-4007-84ac-8b7e0ee9e326 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2009.928696] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1536ad10-129b-439d-80c5-08fa92aeaed1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2009.928904] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2009.929063] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2010.052312] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-087ee4a8-a39e-41b8-afdb-aba02c50a320 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.059992] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73179448-80cc-4b01-9349-0a35bed8d392 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.089414] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7359e731-2fb3-4b9e-bcbe-95d4ac3f555c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.096313] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47ac598b-2e22-4522-8fe6-fb707382f166 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.109088] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2010.117166] env[67977]: DEBUG nova.scheduler.client.report [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2010.132621] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2010.132804] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2033.230322] env[67977]: WARNING oslo_vmware.rw_handles [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2033.230322] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2033.230322] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2033.230322] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2033.230322] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2033.230322] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 2033.230322] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2033.230322] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2033.230322] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2033.230322] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2033.230322] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2033.230322] env[67977]: ERROR oslo_vmware.rw_handles [ 2033.231075] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/5b1d5152-57f2-4b41-ab93-1d74a4d2ac54/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2033.232790] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Caching image {{(pid=67977) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2033.233039] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Copying Virtual Disk [datastore1] vmware_temp/5b1d5152-57f2-4b41-ab93-1d74a4d2ac54/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/5b1d5152-57f2-4b41-ab93-1d74a4d2ac54/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2033.233348] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-16363ab5-0737-4a36-9829-9e1fc8d41186 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2033.241335] env[67977]: DEBUG oslo_vmware.api [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for the task: (returnval){ [ 2033.241335] env[67977]: value = "task-3468284" [ 2033.241335] env[67977]: _type = "Task" [ 2033.241335] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2033.249443] env[67977]: DEBUG oslo_vmware.api [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Task: {'id': task-3468284, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2033.751954] env[67977]: DEBUG oslo_vmware.exceptions [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Fault InvalidArgument not matched. 
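"Fault InvalidArgument not matched" above is the fault-translation step: the fault name is looked up in a registry of specific exception classes and, failing that, wrapped in the generic VimFaultException that the traceback below then raises. A self-contained sketch of that lookup (class names mirror oslo.vmware's, but the registry contents are illustrative):

    class VimFaultException(Exception):
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    class FileNotFoundException(VimFaultException):
        pass

    # Known fault names map to specific classes; anything else falls back.
    _FAULT_CLASSES = {'FileNotFound': FileNotFoundException}

    def translate_fault(fault_name, message):
        cls = _FAULT_CLASSES.get(fault_name)
        if cls is None:
            print("Fault %s not matched." % fault_name)   # the DEBUG line above
            return VimFaultException([fault_name], message)
        return cls([fault_name], message)

    err = translate_fault('InvalidArgument',
                          'A specified parameter was not correct: fileType')
    print(type(err).__name__, err.fault_list)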
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2033.752281] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2033.752841] env[67977]: ERROR nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2033.752841] env[67977]: Faults: ['InvalidArgument'] [ 2033.752841] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Traceback (most recent call last): [ 2033.752841] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2033.752841] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] yield resources [ 2033.752841] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2033.752841] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] self.driver.spawn(context, instance, image_meta, [ 2033.752841] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2033.752841] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2033.752841] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2033.752841] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] self._fetch_image_if_missing(context, vi) [ 2033.752841] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] image_cache(vi, tmp_image_ds_loc) [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] vm_util.copy_virtual_disk( [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] session._wait_for_task(vmdk_copy_task) [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] return self.wait_for_task(task_ref) [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] return evt.wait() [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] result = hub.switch() [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2033.753239] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] return self.greenlet.switch() [ 2033.753600] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2033.753600] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] self.f(*self.args, **self.kw) [ 2033.753600] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2033.753600] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] raise exceptions.translate_fault(task_info.error) [ 2033.753600] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2033.753600] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Faults: ['InvalidArgument'] [ 2033.753600] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] [ 2033.753600] env[67977]: INFO nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Terminating instance [ 2033.754745] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2033.754954] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2033.755205] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-45ea9917-6679-4b9c-bd77-8cef8e462e5c {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2033.757408] env[67977]: DEBUG nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2033.757603] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2033.758332] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f38cc8f8-7ba5-4210-9fa3-767b6ca2c0d5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2033.765186] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2033.765390] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f1d6cc42-6f91-4b43-add4-31a1a0662f13 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2033.767486] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2033.767662] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2033.768589] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4e48311c-c51c-4f14-a70f-3e1a7a78f6e1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2033.773604] env[67977]: DEBUG oslo_vmware.api [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for the task: (returnval){ [ 2033.773604] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52f56a43-c878-5024-a4a4-a3c13270dd09" [ 2033.773604] env[67977]: _type = "Task" [ 2033.773604] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2033.780421] env[67977]: DEBUG oslo_vmware.api [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52f56a43-c878-5024-a4a4-a3c13270dd09, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2033.838098] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2033.838641] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2033.838641] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Deleting the datastore file [datastore1] ac4fe863-2435-48ed-9c7c-9e7144be8e70 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2033.839204] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c7e1700e-6596-40c8-a088-6ab3499745de {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2033.846137] env[67977]: DEBUG oslo_vmware.api [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for the task: (returnval){ [ 2033.846137] env[67977]: value = "task-3468286" [ 2033.846137] env[67977]: _type = "Task" [ 2033.846137] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2033.853885] env[67977]: DEBUG oslo_vmware.api [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Task: {'id': task-3468286, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2034.284555] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2034.284879] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Creating directory with path [datastore1] vmware_temp/6755e172-346f-4a2e-a3e8-5acd2716c28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2034.285047] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-52c47a9f-d4d7-4afb-93bf-1f61eab6e2f7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.296339] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Created directory with path [datastore1] vmware_temp/6755e172-346f-4a2e-a3e8-5acd2716c28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2034.297129] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Fetch image to [datastore1] vmware_temp/6755e172-346f-4a2e-a3e8-5acd2716c28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2034.297129] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/6755e172-346f-4a2e-a3e8-5acd2716c28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2034.297464] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c083c22b-9642-4d6e-a584-ae3806010845 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.304401] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17d3bf4d-475d-42e4-9001-d4e334184f4c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.313779] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f6f40be-4a4d-4ccd-ba0b-77734a9b041a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.344789] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-413823b6-33c5-4aae-8175-1c4516ce6047 {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.356134] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1f8e6221-cbf2-43d0-9803-a92571446743 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.357778] env[67977]: DEBUG oslo_vmware.api [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Task: {'id': task-3468286, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077471} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2034.358081] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2034.358271] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2034.358441] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2034.358611] env[67977]: INFO nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Took 0.60 seconds to destroy the instance on the hypervisor. 
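The destroy sequence above (UnregisterVM, then deleting the datastore contents) runs because spawn failed inside the build-resources context, and the records that follow show the scheduler claim being given back as well. A sketch of that unwind shape, assuming a stub Claim and destroy callback rather than nova's real objects:

    from contextlib import contextmanager

    class Claim:
        """Stub resource claim; abort() returns the resources to the
        tracker, as in the 'Aborting claim' record that follows."""
        def __init__(self, resources):
            self.resources = resources

        def abort(self):
            print("Aborting claim: %s" % self.resources)

    @contextmanager
    def build_resources(claim, destroy_fn):
        """If spawn fails inside the block, tear the guest down and
        give the placement allocation back before re-raising."""
        try:
            yield claim
        except Exception:
            destroy_fn()      # UnregisterVM + delete datastore files above
            claim.abort()
            raise

    claim = Claim({'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1})
    try:
        with build_resources(claim, destroy_fn=lambda: print("Instance destroyed")):
            raise RuntimeError("A specified parameter was not correct: fileType")
    except RuntimeError:
        pass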
[ 2034.360706] env[67977]: DEBUG nova.compute.claims [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2034.360860] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2034.361090] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2034.378098] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2034.516036] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6755e172-346f-4a2e-a3e8-5acd2716c28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2034.574748] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2034.574936] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6755e172-346f-4a2e-a3e8-5acd2716c28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2034.583172] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-838090f5-f208-4cd1-af5e-70b83db21074 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2034.590956] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6c59856-b237-44f3-a920-b0686d7c2667 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2034.619759] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a86e354c-affd-494f-b506-8c13d5fd1fe2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2034.626673] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76bb9f83-f84b-4647-a712-cbe820c872ec {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2034.643689] env[67977]: DEBUG nova.compute.provider_tree [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2034.652267] env[67977]: DEBUG nova.scheduler.client.report [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2034.667891] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.307s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2034.668429] env[67977]: ERROR nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2034.668429] env[67977]: Faults: ['InvalidArgument']
[ 2034.668429] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Traceback (most recent call last):
[ 2034.668429] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2034.668429] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] self.driver.spawn(context, instance, image_meta,
[ 2034.668429] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2034.668429] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2034.668429] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2034.668429] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] self._fetch_image_if_missing(context, vi)
[ 2034.668429] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2034.668429] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] image_cache(vi, tmp_image_ds_loc)
[ 2034.668429] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] vm_util.copy_virtual_disk(
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] session._wait_for_task(vmdk_copy_task)
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] return self.wait_for_task(task_ref)
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] return evt.wait()
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] result = hub.switch()
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] return self.greenlet.switch()
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2034.668798] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] self.f(*self.args, **self.kw)
[ 2034.669177] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2034.669177] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] raise exceptions.translate_fault(task_info.error)
[ 2034.669177] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2034.669177] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Faults: ['InvalidArgument']
[ 2034.669177] env[67977]: ERROR nova.compute.manager [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70]
[ 2034.669177] env[67977]: DEBUG nova.compute.utils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2034.670438] env[67977]: DEBUG nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Build of instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 was re-scheduled: A specified parameter was not correct: fileType
[ 2034.670438] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2034.670862] env[67977]: DEBUG nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2034.671090] env[67977]: DEBUG nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2034.671268] env[67977]: DEBUG nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2034.671432] env[67977]: DEBUG nova.network.neutron [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2034.933352] env[67977]: DEBUG nova.network.neutron [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2034.948118] env[67977]: INFO nova.compute.manager [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Took 0.28 seconds to deallocate network for instance.
[ 2035.041055] env[67977]: INFO nova.scheduler.client.report [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Deleted allocations for instance ac4fe863-2435-48ed-9c7c-9e7144be8e70
[ 2035.064963] env[67977]: DEBUG oslo_concurrency.lockutils [None req-0d9e2601-5cf7-43f0-b531-98f967e6a202 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 639.714s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2035.066193] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 443.836s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2035.066193] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2035.066366] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2035.066459] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2035.068672] env[67977]: INFO nova.compute.manager [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Terminating instance
[ 2035.070503] env[67977]: DEBUG nova.compute.manager [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2035.070725] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2035.071376] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-62fff818-2979-4e6f-8218-0517bca17721 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2035.081535] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-008ad1ab-4f43-4e87-bf95-4e4665b4aacd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2035.093121] env[67977]: DEBUG nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 2035.114545] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ac4fe863-2435-48ed-9c7c-9e7144be8e70 could not be found.
[ 2035.114752] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2035.114930] env[67977]: INFO nova.compute.manager [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2035.115191] env[67977]: DEBUG oslo.service.loopingcall [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2035.115452] env[67977]: DEBUG nova.compute.manager [-] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2035.115557] env[67977]: DEBUG nova.network.neutron [-] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2035.141651] env[67977]: DEBUG nova.network.neutron [-] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2035.149139] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2035.149505] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2035.151492] env[67977]: INFO nova.compute.claims [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 2035.154970] env[67977]: INFO nova.compute.manager [-] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] Took 0.04 seconds to deallocate network for instance.
[ 2035.248292] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ba9db95c-bbe4-42e7-857a-2f8d4fc5c9a3 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2035.249135] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 130.523s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2035.249361] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ac4fe863-2435-48ed-9c7c-9e7144be8e70] During sync_power_state the instance has a pending task (deleting). Skip.
[ 2035.249547] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "ac4fe863-2435-48ed-9c7c-9e7144be8e70" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2035.325578] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc3a31d6-359c-466e-811c-f1cfb6ca45f7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2035.334190] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1d65541-fb56-4cb7-bfeb-809973dc0fff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2035.363590] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-189d08c6-c827-4add-9814-565005196694 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2035.371974] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4a0f295-6098-4115-ad51-88a1f723c292 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2035.385305] env[67977]: DEBUG nova.compute.provider_tree [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2035.393923] env[67977]: DEBUG nova.scheduler.client.report [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2035.408413] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2035.408948] env[67977]: DEBUG nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 2035.444095] env[67977]: DEBUG nova.compute.utils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 2035.445132] env[67977]: DEBUG nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 2035.445686] env[67977]: DEBUG nova.network.neutron [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 2035.454221] env[67977]: DEBUG nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 2035.519958] env[67977]: DEBUG nova.policy [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd35039d87f274119a281d2836618862b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '629b2265a2eb45128d27cb16a9e0304b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}}
[ 2035.540900] env[67977]: DEBUG nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 2035.567035] env[67977]: DEBUG nova.virt.hardware [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 2035.567315] env[67977]: DEBUG nova.virt.hardware [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 2035.567476] env[67977]: DEBUG nova.virt.hardware [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 2035.567656] env[67977]: DEBUG nova.virt.hardware [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 2035.567803] env[67977]: DEBUG nova.virt.hardware [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 2035.567947] env[67977]: DEBUG nova.virt.hardware [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 2035.568169] env[67977]: DEBUG nova.virt.hardware [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 2035.568327] env[67977]: DEBUG nova.virt.hardware [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 2035.568536] env[67977]: DEBUG nova.virt.hardware [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 2035.568655] env[67977]: DEBUG nova.virt.hardware [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 2035.568875] env[67977]: DEBUG nova.virt.hardware [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 2035.569829] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d369a53-5342-4352-b1c0-0edf702d302d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2035.577315] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d60a46e6-2f40-4b4e-b1cd-742d68a5f555 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2035.795259] env[67977]: DEBUG nova.network.neutron [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Successfully created port: fbb808ca-bcb9-4da9-a191-6f391de7158e {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 2036.500517] env[67977]: DEBUG nova.network.neutron [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Successfully updated port: fbb808ca-bcb9-4da9-a191-6f391de7158e {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 2036.514045] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "refresh_cache-1536ad10-129b-439d-80c5-08fa92aeaed1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2036.514210] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired lock "refresh_cache-1536ad10-129b-439d-80c5-08fa92aeaed1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2036.514407] env[67977]: DEBUG nova.network.neutron [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 2036.569138] env[67977]: DEBUG nova.network.neutron [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 2036.770322] env[67977]: DEBUG nova.network.neutron [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Updating instance_info_cache with network_info: [{"id": "fbb808ca-bcb9-4da9-a191-6f391de7158e", "address": "fa:16:3e:3a:2f:32", "network": {"id": "9f9704ef-f97e-4049-b46f-6c90efa33e6e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-765087939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "629b2265a2eb45128d27cb16a9e0304b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfbb808ca-bc", "ovs_interfaceid": "fbb808ca-bcb9-4da9-a191-6f391de7158e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2036.783421] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Releasing lock "refresh_cache-1536ad10-129b-439d-80c5-08fa92aeaed1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2036.783702] env[67977]: DEBUG nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Instance network_info: |[{"id": "fbb808ca-bcb9-4da9-a191-6f391de7158e", "address": "fa:16:3e:3a:2f:32", "network": {"id": "9f9704ef-f97e-4049-b46f-6c90efa33e6e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-765087939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "629b2265a2eb45128d27cb16a9e0304b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfbb808ca-bc", "ovs_interfaceid": "fbb808ca-bcb9-4da9-a191-6f391de7158e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 2036.784090] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3a:2f:32', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '89ef02af-c508-432f-ae29-3a219701d584', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fbb808ca-bcb9-4da9-a191-6f391de7158e', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 2036.791646] env[67977]: DEBUG oslo.service.loopingcall [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2036.792101] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 2036.792347] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6f5c6269-9a51-44e1-b75f-4bb4edd7b7ef {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2036.812103] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 2036.812103] env[67977]: value = "task-3468287"
[ 2036.812103] env[67977]: _type = "Task"
[ 2036.812103] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2036.821886] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468287, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2036.963964] env[67977]: DEBUG nova.compute.manager [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Received event network-vif-plugged-fbb808ca-bcb9-4da9-a191-6f391de7158e {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2036.964247] env[67977]: DEBUG oslo_concurrency.lockutils [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] Acquiring lock "1536ad10-129b-439d-80c5-08fa92aeaed1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2036.964528] env[67977]: DEBUG oslo_concurrency.lockutils [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] Lock "1536ad10-129b-439d-80c5-08fa92aeaed1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2036.964759] env[67977]: DEBUG oslo_concurrency.lockutils [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] Lock "1536ad10-129b-439d-80c5-08fa92aeaed1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2036.964937] env[67977]: DEBUG nova.compute.manager [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] No waiting events found dispatching network-vif-plugged-fbb808ca-bcb9-4da9-a191-6f391de7158e {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 2036.965346] env[67977]: WARNING nova.compute.manager [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Received unexpected event network-vif-plugged-fbb808ca-bcb9-4da9-a191-6f391de7158e for instance with vm_state building and task_state spawning.
[ 2036.965527] env[67977]: DEBUG nova.compute.manager [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Received event network-changed-fbb808ca-bcb9-4da9-a191-6f391de7158e {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2036.965706] env[67977]: DEBUG nova.compute.manager [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Refreshing instance network info cache due to event network-changed-fbb808ca-bcb9-4da9-a191-6f391de7158e. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 2036.965909] env[67977]: DEBUG oslo_concurrency.lockutils [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] Acquiring lock "refresh_cache-1536ad10-129b-439d-80c5-08fa92aeaed1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2036.966085] env[67977]: DEBUG oslo_concurrency.lockutils [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] Acquired lock "refresh_cache-1536ad10-129b-439d-80c5-08fa92aeaed1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2036.966290] env[67977]: DEBUG nova.network.neutron [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Refreshing network info cache for port fbb808ca-bcb9-4da9-a191-6f391de7158e {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 2037.233574] env[67977]: DEBUG nova.network.neutron [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Updated VIF entry in instance network info cache for port fbb808ca-bcb9-4da9-a191-6f391de7158e. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 2037.233922] env[67977]: DEBUG nova.network.neutron [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Updating instance_info_cache with network_info: [{"id": "fbb808ca-bcb9-4da9-a191-6f391de7158e", "address": "fa:16:3e:3a:2f:32", "network": {"id": "9f9704ef-f97e-4049-b46f-6c90efa33e6e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-765087939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "629b2265a2eb45128d27cb16a9e0304b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfbb808ca-bc", "ovs_interfaceid": "fbb808ca-bcb9-4da9-a191-6f391de7158e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2037.243588] env[67977]: DEBUG oslo_concurrency.lockutils [req-1f61d2da-df45-4ee7-8a74-3929e864a689 req-c1c62208-c197-4003-a99e-19278b34efdb service nova] Releasing lock "refresh_cache-1536ad10-129b-439d-80c5-08fa92aeaed1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2037.322384] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468287, 'name': CreateVM_Task, 'duration_secs': 0.312444} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2037.322548] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 2037.323222] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2037.323387] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2037.323717] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 2037.323964] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8ed170bc-85ac-4c0f-9a75-a5e950056725 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2037.328568] env[67977]: DEBUG oslo_vmware.api [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){
[ 2037.328568] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52978149-cdfe-9c70-2eb5-c6440a621b04"
[ 2037.328568] env[67977]: _type = "Task"
[ 2037.328568] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2037.337092] env[67977]: DEBUG oslo_vmware.api [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52978149-cdfe-9c70-2eb5-c6440a621b04, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2037.839129] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2037.839410] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 2037.839625] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2063.132924] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2064.777479] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2064.777816] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2065.775680] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2066.776101] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2066.776101] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2066.776541] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2067.772095] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2069.776622] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2069.777029] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2069.777029] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2069.798576] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2069.798771] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2069.798935] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2069.799206] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2069.799365] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2069.799490] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2069.799611] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2069.799732] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2069.799853] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2069.799973] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2069.800111] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2070.776066] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2070.787599] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2070.787899] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2070.787988] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2070.788160] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 2070.789613] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-651f965a-c74d-4083-9102-034544f6a120 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2070.798277] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d73595df-50d1-42ab-86bd-8ec4d9041862 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2070.813285] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e02cbbbd-afee-43f2-96ab-6374d43794e4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2070.819502] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeefca71-e7e4-4997-88de-844ad970ea72 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2070.848739] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180900MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 2070.848888] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2070.849091] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2070.920743] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2070.920957] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d1fc2ae5-fa11-41a7-808b-13da16667078 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2070.921108] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 511896d4-d9cb-42e0-b213-31be3cac191c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2070.921233] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2070.921358] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2070.921478] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2070.921597] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2070.921721] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 98f7c8cc-b27f-406c-b34d-22c2c4609e24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2070.921830] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 142d3b29-b467-4007-84ac-8b7e0ee9e326 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2070.921947] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1536ad10-129b-439d-80c5-08fa92aeaed1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2070.922255] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2070.922447] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2071.038105] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bf3cd0b-2973-4d97-a457-323aca0a4e39 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.045609] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17859a26-a632-41de-8d4d-5b40a35eddc8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.075748] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8bb48b9-5949-4a88-ab58-a438c32b8b47 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.082695] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-059c5467-1386-475a-9607-3231f0cd50ca {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.095807] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2071.104524] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2071.119546] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2071.119735] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.271s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2083.250343] env[67977]: WARNING oslo_vmware.rw_handles [None 
req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2083.250343] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2083.250343] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2083.250343] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2083.250343] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2083.250343] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 2083.250343] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2083.250343] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2083.250343] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2083.250343] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2083.250343] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2083.250343] env[67977]: ERROR oslo_vmware.rw_handles [ 2083.251260] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/6755e172-346f-4a2e-a3e8-5acd2716c28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2083.252794] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2083.252991] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Copying Virtual Disk [datastore1] vmware_temp/6755e172-346f-4a2e-a3e8-5acd2716c28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/6755e172-346f-4a2e-a3e8-5acd2716c28c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2083.253332] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8a890933-e9a6-4009-9bbc-22b29c62d517 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.261521] env[67977]: DEBUG oslo_vmware.api [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for the task: (returnval){ [ 2083.261521] env[67977]: value = "task-3468288" [ 2083.261521] env[67977]: _type = "Task" [ 2083.261521] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2083.269216] env[67977]: DEBUG oslo_vmware.api [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Task: {'id': task-3468288, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2083.772283] env[67977]: DEBUG oslo_vmware.exceptions [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2083.772588] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2083.773153] env[67977]: ERROR nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2083.773153] env[67977]: Faults: ['InvalidArgument'] [ 2083.773153] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Traceback (most recent call last): [ 2083.773153] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2083.773153] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] yield resources [ 2083.773153] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2083.773153] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] self.driver.spawn(context, instance, image_meta, [ 2083.773153] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2083.773153] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2083.773153] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2083.773153] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] self._fetch_image_if_missing(context, vi) [ 2083.773153] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] image_cache(vi, tmp_image_ds_loc) [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] vm_util.copy_virtual_disk( [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] session._wait_for_task(vmdk_copy_task) [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] return self.wait_for_task(task_ref) [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] return evt.wait() [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] result = hub.switch() [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2083.773561] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] return self.greenlet.switch() [ 2083.773952] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2083.773952] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] self.f(*self.args, **self.kw) [ 2083.773952] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2083.773952] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] raise exceptions.translate_fault(task_info.error) [ 2083.773952] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2083.773952] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Faults: ['InvalidArgument'] [ 2083.773952] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] [ 2083.773952] env[67977]: INFO nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Terminating instance [ 2083.775093] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2083.775379] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2083.775607] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2bbba253-c69b-4e54-8c76-b75bd9c1aad1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.778069] env[67977]: DEBUG nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2083.778299] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2083.779068] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68a63964-a06e-491a-abf1-5910e2e85f02 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.785893] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2083.786117] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cc08bd15-2f6f-4671-a600-a9076d2811f6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.788405] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2083.788583] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2083.789580] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d4b3b8a4-38ef-4eae-b7f4-27fcc404a98b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.794327] env[67977]: DEBUG oslo_vmware.api [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for the task: (returnval){ [ 2083.794327] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52df67ca-3d81-1295-f307-4512040d9365" [ 2083.794327] env[67977]: _type = "Task" [ 2083.794327] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2083.808608] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2083.808835] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Creating directory with path [datastore1] vmware_temp/67852f61-509a-434f-ba71-c2bb0fed43e3/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2083.809062] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-62cd5390-9dde-4e50-9af1-18c6f9031993 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.830443] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Created directory with path [datastore1] vmware_temp/67852f61-509a-434f-ba71-c2bb0fed43e3/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2083.830785] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Fetch image to [datastore1] vmware_temp/67852f61-509a-434f-ba71-c2bb0fed43e3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2083.831103] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/67852f61-509a-434f-ba71-c2bb0fed43e3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2083.831971] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26bf58cb-8d16-496d-bcc0-8b48ec394d89 {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.838807] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e14deea-0fec-4044-96da-54c03dd53170 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.847996] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-955bac37-510a-4588-9be8-5594a1597c10 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.854484] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2083.854701] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2083.854905] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Deleting the datastore file [datastore1] 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2083.880317] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5cdb3a5d-f1c8-4946-bd7a-34e75e5d212e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.882675] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6465e384-6a57-4787-92d9-201b5347b4a3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.889448] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5bc48242-220c-4215-a70e-a580ca34186a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.891113] env[67977]: DEBUG oslo_vmware.api [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for the task: (returnval){ [ 2083.891113] env[67977]: value = "task-3468290" [ 2083.891113] env[67977]: _type = "Task" [ 2083.891113] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2083.898515] env[67977]: DEBUG oslo_vmware.api [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Task: {'id': task-3468290, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2083.919612] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2084.042115] env[67977]: DEBUG oslo_vmware.rw_handles [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/67852f61-509a-434f-ba71-c2bb0fed43e3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2084.100116] env[67977]: DEBUG oslo_vmware.rw_handles [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2084.100328] env[67977]: DEBUG oslo_vmware.rw_handles [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/67852f61-509a-434f-ba71-c2bb0fed43e3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2084.401517] env[67977]: DEBUG oslo_vmware.api [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Task: {'id': task-3468290, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07034} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2084.401892] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2084.401892] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2084.402067] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2084.402246] env[67977]: INFO nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Took 0.62 seconds to destroy the instance on the hypervisor. [ 2084.404303] env[67977]: DEBUG nova.compute.claims [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2084.404471] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2084.404682] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2084.643571] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8315695-c462-4bdd-8b28-055880cd68ca {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2084.651143] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db904506-9082-4036-935c-3c28d4b21f4f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2084.680757] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4681604-0a48-475d-94e2-0f3e214338b9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2084.687394] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-755548f5-3053-4212-8aeb-43cdd0b650ba {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2084.700928] env[67977]: DEBUG nova.compute.provider_tree [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2084.709439] env[67977]: DEBUG nova.scheduler.client.report [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2084.737231] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.332s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2084.737765] env[67977]: ERROR nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2084.737765] env[67977]: Faults: ['InvalidArgument'] [ 2084.737765] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Traceback (most recent call last): [ 2084.737765] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2084.737765] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] self.driver.spawn(context, instance, image_meta, [ 2084.737765] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2084.737765] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2084.737765] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2084.737765] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] self._fetch_image_if_missing(context, vi) [ 2084.737765] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2084.737765] env[67977]: ERROR nova.compute.manager [instance: 
8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] image_cache(vi, tmp_image_ds_loc) [ 2084.737765] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] vm_util.copy_virtual_disk( [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] session._wait_for_task(vmdk_copy_task) [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] return self.wait_for_task(task_ref) [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] return evt.wait() [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] result = hub.switch() [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] return self.greenlet.switch() [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2084.738177] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] self.f(*self.args, **self.kw) [ 2084.738636] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2084.738636] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] raise exceptions.translate_fault(task_info.error) [ 2084.738636] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2084.738636] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Faults: ['InvalidArgument'] [ 2084.738636] env[67977]: ERROR nova.compute.manager [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] [ 2084.738636] env[67977]: DEBUG nova.compute.utils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2084.739897] env[67977]: DEBUG 
nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Build of instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa was re-scheduled: A specified parameter was not correct: fileType [ 2084.739897] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2084.740278] env[67977]: DEBUG nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2084.740447] env[67977]: DEBUG nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2084.740796] env[67977]: DEBUG nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2084.740796] env[67977]: DEBUG nova.network.neutron [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2085.280667] env[67977]: DEBUG nova.network.neutron [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2085.294849] env[67977]: INFO nova.compute.manager [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Took 0.55 seconds to deallocate network for instance.
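Two points in the span above are worth making concrete. First, the resource tracker's arithmetic checks out against the inventory it reports: with 512MB reserved and ten instances each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1} in placement, used_ram = 512 + 10 x 128 = 1792MB, used_disk = 10 x 1 = 10GB and used_vcpus = 10, exactly the "Final resource view" line. Second, the traceback shows CopyVirtualDisk_Task failing inside wait_for_task/_poll_task: the task is polled until it finishes, and an error state is translated into a raised fault exception. Below is a minimal, self-contained sketch of that polling pattern; TaskFault, wait_for_task and the task-info dict shape are illustrative stand-ins inferred from the log, not the actual oslo.vmware API.

import time

class TaskFault(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, message, fault_list):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(poll, interval=0.5, timeout=30.0):
    """Poll `poll()` until the task reports success or error.

    `poll` returns a dict shaped like the task info in the log:
    {'state': 'running'|'success'|'error', 'error': str, 'faults': [...]}.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = poll()
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            # Mirrors _poll_task raising the translated task fault.
            raise TaskFault(info['error'], info.get('faults', []))
        time.sleep(interval)
    raise TimeoutError('task did not complete in time')

if __name__ == '__main__':
    # Replay the failure seen above: one running poll, then an error.
    states = iter([
        {'state': 'running'},
        {'state': 'error',
         'error': 'A specified parameter was not correct: fileType',
         'faults': ['InvalidArgument']},
    ])
    try:
        wait_for_task(lambda: next(states), interval=0.01)
    except TaskFault as exc:
        print(f'task failed: {exc} (faults: {exc.fault_list})')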
[ 2085.402756] env[67977]: INFO nova.scheduler.client.report [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Deleted allocations for instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa [ 2085.446871] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4c2a330e-fc66-4f3b-9174-289822381e31 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 622.035s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2085.447130] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 425.388s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2085.447354] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Acquiring lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2085.447565] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2085.447731] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2085.452689] env[67977]: INFO nova.compute.manager [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Terminating instance [ 2085.452689] env[67977]: DEBUG nova.compute.manager [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2085.452689] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2085.452689] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6118e238-bb6b-474c-8dde-e159e033868b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2085.460385] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f213dd51-7a75-44e3-b4b9-cc98c2ee0661 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2085.487753] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa could not be found. [ 2085.487940] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2085.488128] env[67977]: INFO nova.compute.manager [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2085.488357] env[67977]: DEBUG oslo.service.loopingcall [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2085.488583] env[67977]: DEBUG nova.compute.manager [-] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2085.488679] env[67977]: DEBUG nova.network.neutron [-] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2085.511995] env[67977]: DEBUG nova.network.neutron [-] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2085.519444] env[67977]: INFO nova.compute.manager [-] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] Took 0.03 seconds to deallocate network for instance. 
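The terminate path above serializes work on named locks ("8e81c715-...", "8e81c715-...-events", "compute_resources"), and every lockutils line records both how long the caller waited to acquire the lock and how long it was held, which is what makes the 622.035s held / 425.388s waited figures visible. A small self-contained sketch of that named-lock-with-timing pattern follows; `synchronized` here is a hypothetical stand-in written for illustration, not oslo_concurrency's actual decorator.

import threading
import time

_locks = {}
_registry_guard = threading.Lock()

def synchronized(name):
    """Serialize callers on a named lock, logging waited/held durations
    in the spirit of the lockutils lines above."""
    def wrap(fn):
        def inner(*args, **kwargs):
            with _registry_guard:
                lock = _locks.setdefault(name, threading.Lock())
            target = f'{fn.__module__}.{fn.__qualname__}'
            print(f'Acquiring lock "{name}" by "{target}"')
            t0 = time.monotonic()
            with lock:
                waited = time.monotonic() - t0
                print(f'Lock "{name}" acquired by "{target}" :: waited {waited:.3f}s')
                t1 = time.monotonic()
                try:
                    return fn(*args, **kwargs)
                finally:
                    held = time.monotonic() - t1
                    print(f'Lock "{name}" "released" by "{target}" :: held {held:.3f}s')
        return inner
    return wrap

@synchronized('compute_resources')
def do_terminate_instance():
    time.sleep(0.05)  # stand-in for the actual teardown work

if __name__ == '__main__':
    do_terminate_instance()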
[ 2085.647488] env[67977]: DEBUG oslo_concurrency.lockutils [None req-bf26aa1b-3cca-4757-8bd6-0a87885310c9 tempest-AttachVolumeTestJSON-1708738584 tempest-AttachVolumeTestJSON-1708738584-project-member] Lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.200s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2085.648458] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 180.922s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2085.648694] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa] During sync_power_state the instance has a pending task (deleting). Skip. [ 2085.648904] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2104.940111] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2124.121027] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2124.775599] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2125.771028] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2125.793016] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2125.793016] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2127.776491] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None 
None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2127.776964] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2127.776964] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2129.771653] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2130.775583] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2130.775875] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2130.775875] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2130.796356] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2130.796554] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2130.796725] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2130.796882] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2130.797044] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2130.797212] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2130.797378] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2130.797513] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2130.797668] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2130.797810] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2132.735141] env[67977]: WARNING oslo_vmware.rw_handles [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2132.735141] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2132.735141] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2132.735141] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2132.735141] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2132.735141] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 2132.735141] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2132.735141] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2132.735141] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2132.735141] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2132.735141] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2132.735141] env[67977]: ERROR oslo_vmware.rw_handles [ 2132.736505] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/67852f61-509a-434f-ba71-c2bb0fed43e3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data 
store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2132.737747] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2132.738016] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Copying Virtual Disk [datastore1] vmware_temp/67852f61-509a-434f-ba71-c2bb0fed43e3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/67852f61-509a-434f-ba71-c2bb0fed43e3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2132.738358] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dfa2ce98-2a48-4278-9ae7-a8064e0fe391 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2132.745674] env[67977]: DEBUG oslo_vmware.api [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for the task: (returnval){ [ 2132.745674] env[67977]: value = "task-3468291" [ 2132.745674] env[67977]: _type = "Task" [ 2132.745674] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2132.753880] env[67977]: DEBUG oslo_vmware.api [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Task: {'id': task-3468291, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2132.775452] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2132.787384] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2132.787549] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2132.787647] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2132.787753] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2132.788842] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-551a3e22-6e3f-47a4-a557-09831cb3c9d9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2132.796772] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f1bf781-578d-478a-bd55-83ac2effb260 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2132.810390] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-170fcf13-2a71-45f7-abd7-7dd797f9e7cc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2132.816625] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9cac343-9cd0-4a1b-9373-3f33a5dbcb31 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2132.846046] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180908MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2132.846192] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2132.846381] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2132.913229] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d1fc2ae5-fa11-41a7-808b-13da16667078 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2132.913446] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 511896d4-d9cb-42e0-b213-31be3cac191c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2132.913618] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2132.913764] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2132.913915] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2132.914071] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2132.914204] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 98f7c8cc-b27f-406c-b34d-22c2c4609e24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2132.914326] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 142d3b29-b467-4007-84ac-8b7e0ee9e326 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2132.914441] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1536ad10-129b-439d-80c5-08fa92aeaed1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2132.914625] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2132.914761] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2133.019598] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6db396f1-8b0b-4d01-8c97-b6e7347875e8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.027346] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-939ad41a-c0cf-4da7-a16a-eb8f3f1cfd32 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.057113] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23998a54-9a1b-4c3e-ac24-586851a9999e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.064365] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c9e146d-df07-457f-a302-42db5c0ac6ad {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.078808] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2133.087051] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2133.100602] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2133.100778] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.254s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2133.254843] env[67977]: DEBUG oslo_vmware.exceptions [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2133.255234] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2133.255793] env[67977]: ERROR nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2133.255793] env[67977]: Faults: ['InvalidArgument'] [ 2133.255793] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Traceback (most recent call last): [ 2133.255793] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2133.255793] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] yield resources [ 2133.255793] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2133.255793] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] self.driver.spawn(context, instance, image_meta, [ 2133.255793] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2133.255793] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2133.255793] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2133.255793] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] self._fetch_image_if_missing(context, vi) [ 2133.255793] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] image_cache(vi, tmp_image_ds_loc) [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] vm_util.copy_virtual_disk( [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] session._wait_for_task(vmdk_copy_task) [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] return self.wait_for_task(task_ref) [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] return evt.wait() [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] result = hub.switch() [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2133.256246] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] return self.greenlet.switch() [ 2133.256701] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2133.256701] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] self.f(*self.args, **self.kw) [ 2133.256701] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2133.256701] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] raise exceptions.translate_fault(task_info.error) [ 2133.256701] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2133.256701] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Faults: ['InvalidArgument'] [ 2133.256701] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] [ 2133.256701] env[67977]: INFO nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: 
d1fc2ae5-fa11-41a7-808b-13da16667078] Terminating instance [ 2133.257597] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2133.257799] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2133.258039] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6f78772e-d6bc-4853-81ad-25181fb15a2f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.260283] env[67977]: DEBUG nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2133.260475] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2133.261185] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1756fe41-496e-4076-96dd-896048acb134 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.267781] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2133.267986] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-57c541e5-45e6-430a-881b-7baad2c460ac {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.270074] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2133.270252] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2133.271176] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aec7863e-d073-459b-a23e-c3569938069e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.276023] env[67977]: DEBUG oslo_vmware.api [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 2133.276023] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52b7073f-92c0-4d2a-ea09-236e02529ce4" [ 2133.276023] env[67977]: _type = "Task" [ 2133.276023] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2133.287167] env[67977]: DEBUG oslo_vmware.api [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52b7073f-92c0-4d2a-ea09-236e02529ce4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2133.330655] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2133.330831] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2133.331015] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Deleting the datastore file [datastore1] d1fc2ae5-fa11-41a7-808b-13da16667078 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2133.331297] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ff6a083d-e77a-41aa-a07f-ec8d7e35ee0b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.338426] env[67977]: DEBUG oslo_vmware.api [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for the task: (returnval){ [ 2133.338426] env[67977]: value = "task-3468293" [ 2133.338426] env[67977]: _type = "Task" [ 2133.338426] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2133.345769] env[67977]: DEBUG oslo_vmware.api [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Task: {'id': task-3468293, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2133.785899] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2133.786247] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating directory with path [datastore1] vmware_temp/06a86dcd-0b75-4d91-924c-e6eb141fb996/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2133.786472] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f63baffb-3d3a-4161-b047-9c1a7fa8fa07 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.797332] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Created directory with path [datastore1] vmware_temp/06a86dcd-0b75-4d91-924c-e6eb141fb996/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2133.797558] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Fetch image to [datastore1] vmware_temp/06a86dcd-0b75-4d91-924c-e6eb141fb996/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2133.797794] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/06a86dcd-0b75-4d91-924c-e6eb141fb996/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2133.798551] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-630a31c2-ee06-4c0e-adb9-bd5aa051c28f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.804996] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccf96791-da70-4279-a466-3a7409fa343f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.813977] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-494475a0-bd06-469e-9b49-2e7f378d28c4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.848172] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-900d75a4-256e-45e0-aca1-5536d5455201 {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.854877] env[67977]: DEBUG oslo_vmware.api [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Task: {'id': task-3468293, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074021} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2133.856344] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2133.856537] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2133.856708] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2133.856881] env[67977]: INFO nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Took 0.60 seconds to destroy the instance on the hypervisor. 
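The CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all follow the same shape: the driver submits a vCenter task, then wait_for_task blocks while a looping call polls task state ("progress is 0%", then "completed successfully") and raises the fault translated from task_info.error on failure. Below is a minimal sketch of that poll-until-done pattern, not oslo.vmware's actual implementation; get_task_info and TaskFault are hypothetical stand-ins for the SOAP call and fault class.

```python
import time


class TaskFault(Exception):
    """Hypothetical stand-in: raised when the remote task ends in error."""


def wait_for_task(get_task_info, task_id, poll_interval=0.5, timeout=300.0):
    """Poll a remote task until it succeeds, fails, or times out.

    get_task_info is any callable returning a dict with a 'state' key
    ('running' | 'success' | 'error'); it stands in for the per-poll
    SOAP request the driver issues.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info(task_id)
        if info["state"] == "success":
            return info.get("result")
        if info["state"] == "error":
            # Corresponds to _poll_task raising the translated fault
            # (the VimFaultException seen later in this log).
            raise TaskFault(info.get("error"))
        time.sleep(poll_interval)  # each pass logs "progress is N%"
    raise TimeoutError(f"task {task_id} did not finish within {timeout}s")


if __name__ == "__main__":
    # Tiny self-test: a fake task that completes on the third poll.
    polls = {"n": 0}

    def fake_info(task_id):
        polls["n"] += 1
        if polls["n"] >= 3:
            return {"state": "success", "result": "ok"}
        return {"state": "running"}

    print(wait_for_task(fake_info, "task-3468291", poll_interval=0.01))
```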
[ 2133.858618] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c8c47fc9-9104-44cc-ac80-c13aab083de8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2133.860514] env[67977]: DEBUG nova.compute.claims [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2133.860711] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2133.860924] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2133.885456] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2133.936730] env[67977]: DEBUG oslo_vmware.rw_handles [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/06a86dcd-0b75-4d91-924c-e6eb141fb996/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2133.995900] env[67977]: DEBUG oslo_vmware.rw_handles [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2133.996224] env[67977]: DEBUG oslo_vmware.rw_handles [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/06a86dcd-0b75-4d91-924c-e6eb141fb996/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2134.062590] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef7e0422-ed0d-4a9e-aa0d-8fbb9d1e0d46 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.070336] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c32bdfba-e16d-4e3a-9b02-6c811876210f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.100324] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa81ad2b-1631-4652-9a43-1892d564186a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.107562] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a503951c-2dc4-44c4-9ad5-4dee08fe8808 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.120466] env[67977]: DEBUG nova.compute.provider_tree [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2134.129098] env[67977]: DEBUG nova.scheduler.client.report [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2134.141491] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.280s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2134.142041] env[67977]: ERROR nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2134.142041] env[67977]: Faults: ['InvalidArgument'] [ 2134.142041] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Traceback (most recent call last): [ 2134.142041] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2134.142041] env[67977]: ERROR 
nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] self.driver.spawn(context, instance, image_meta, [ 2134.142041] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2134.142041] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2134.142041] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2134.142041] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] self._fetch_image_if_missing(context, vi) [ 2134.142041] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2134.142041] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] image_cache(vi, tmp_image_ds_loc) [ 2134.142041] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] vm_util.copy_virtual_disk( [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] session._wait_for_task(vmdk_copy_task) [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] return self.wait_for_task(task_ref) [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] return evt.wait() [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] result = hub.switch() [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] return self.greenlet.switch() [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2134.142523] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] self.f(*self.args, **self.kw) [ 2134.142933] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2134.142933] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] raise exceptions.translate_fault(task_info.error) [ 2134.142933] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2134.142933] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Faults: ['InvalidArgument'] [ 2134.142933] env[67977]: ERROR nova.compute.manager [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] [ 2134.142933] env[67977]: DEBUG nova.compute.utils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2134.144090] env[67977]: DEBUG nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Build of instance d1fc2ae5-fa11-41a7-808b-13da16667078 was re-scheduled: A specified parameter was not correct: fileType [ 2134.144090] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2134.144462] env[67977]: DEBUG nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2134.144633] env[67977]: DEBUG nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2134.144803] env[67977]: DEBUG nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2134.144965] env[67977]: DEBUG nova.network.neutron [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2134.481136] env[67977]: DEBUG nova.network.neutron [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2134.492404] env[67977]: INFO nova.compute.manager [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Took 0.35 seconds to deallocate network for instance. [ 2134.589544] env[67977]: INFO nova.scheduler.client.report [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Deleted allocations for instance d1fc2ae5-fa11-41a7-808b-13da16667078 [ 2134.609732] env[67977]: DEBUG oslo_concurrency.lockutils [None req-64426db1-46ce-46e8-b7dd-3410556afb27 tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "d1fc2ae5-fa11-41a7-808b-13da16667078" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 624.927s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2134.609998] env[67977]: DEBUG oslo_concurrency.lockutils [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "d1fc2ae5-fa11-41a7-808b-13da16667078" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 429.016s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2134.610242] env[67977]: DEBUG oslo_concurrency.lockutils [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Acquiring lock "d1fc2ae5-fa11-41a7-808b-13da16667078-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2134.610452] env[67977]: DEBUG oslo_concurrency.lockutils [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "d1fc2ae5-fa11-41a7-808b-13da16667078-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s
{{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2134.610616] env[67977]: DEBUG oslo_concurrency.lockutils [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "d1fc2ae5-fa11-41a7-808b-13da16667078-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2134.612581] env[67977]: INFO nova.compute.manager [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Terminating instance [ 2134.614340] env[67977]: DEBUG nova.compute.manager [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2134.614536] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2134.615118] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-61cdd23b-9e84-4c68-8897-faf1682afdf6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.624576] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b82209d-d066-40ae-b6d7-2ebe77101aa8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.657271] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d1fc2ae5-fa11-41a7-808b-13da16667078 could not be found. [ 2134.657395] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2134.657528] env[67977]: INFO nova.compute.manager [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2134.657775] env[67977]: DEBUG oslo.service.loopingcall [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2134.658056] env[67977]: DEBUG nova.compute.manager [-] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2134.658133] env[67977]: DEBUG nova.network.neutron [-] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2134.686218] env[67977]: DEBUG nova.network.neutron [-] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2134.694637] env[67977]: INFO nova.compute.manager [-] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] Took 0.04 seconds to deallocate network for instance. [ 2134.780114] env[67977]: DEBUG oslo_concurrency.lockutils [None req-525bcd3d-344a-46cd-82b7-8de1413b001e tempest-SecurityGroupsTestJSON-163711361 tempest-SecurityGroupsTestJSON-163711361-project-member] Lock "d1fc2ae5-fa11-41a7-808b-13da16667078" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.170s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2134.780941] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "d1fc2ae5-fa11-41a7-808b-13da16667078" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 230.055s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2134.781140] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d1fc2ae5-fa11-41a7-808b-13da16667078] During sync_power_state the instance has a pending task (deleting). Skip.
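The Acquiring/acquired/released triplets above come from oslo.concurrency's named in-process locks: each compute operation wraps its critical section in a function nested inside the manager method, and the holder string logged is that inner function's qualname, which is why it contains ".<locals>.". A minimal sketch of the pattern follows; lockutils.synchronized is the real oslo.concurrency decorator, while the function body and the main-guard demo are illustrative stand-ins, not Nova's code.

```python
from oslo_concurrency import lockutils


def terminate_instance(instance_uuid):
    # The lock name is the instance UUID, so terminate_instance and the
    # periodic _sync_power_states pass serialize per instance; the log
    # above shows the sync task waiting 230.055s for exactly this lock.
    @lockutils.synchronized(instance_uuid)
    def do_terminate_instance():
        # Stand-in body: shutdown, deallocate network, drop DB record.
        print(f"terminating {instance_uuid} under lock")

    do_terminate_instance()


if __name__ == "__main__":
    terminate_instance("8e81c715-9cb3-4a4d-afd7-5fd5d71ac0aa")
```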
[ 2134.781325] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "d1fc2ae5-fa11-41a7-808b-13da16667078" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2148.177568] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "142d3b29-b467-4007-84ac-8b7e0ee9e326" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2152.100244] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "396fd258-dc89-4392-a487-921958012e92" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2152.100560] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "396fd258-dc89-4392-a487-921958012e92" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2152.125168] env[67977]: DEBUG nova.compute.manager [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Starting instance... 
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2152.175993] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2152.176283] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2152.177691] env[67977]: INFO nova.compute.claims [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2152.320431] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc51422c-474a-439f-837b-10d679233725 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.328223] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7fa7e06-3be1-4780-a239-f7347d061b17 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.358779] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec0c1f34-5e95-447b-a67d-26dadd664abd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.366042] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50d188ea-0b5c-4c88-b509-29ed46b38737 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.378981] env[67977]: DEBUG nova.compute.provider_tree [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2152.388894] env[67977]: DEBUG nova.scheduler.client.report [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2152.401239] env[67977]: DEBUG oslo_concurrency.lockutils 
[None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.225s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2152.401687] env[67977]: DEBUG nova.compute.manager [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2152.435312] env[67977]: DEBUG nova.compute.utils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2152.436442] env[67977]: DEBUG nova.compute.manager [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2152.436607] env[67977]: DEBUG nova.network.neutron [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2152.444190] env[67977]: DEBUG nova.compute.manager [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2152.506648] env[67977]: DEBUG nova.compute.manager [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2152.522527] env[67977]: DEBUG nova.policy [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd76b3cc7fe2143dabe6ab02906a25097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '52e6b27298274fa1a10d95d9a967814b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 2152.532081] env[67977]: DEBUG nova.virt.hardware [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2152.532330] env[67977]: DEBUG nova.virt.hardware [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2152.532488] env[67977]: DEBUG nova.virt.hardware [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2152.532673] env[67977]: DEBUG nova.virt.hardware [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2152.532829] env[67977]: DEBUG nova.virt.hardware [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2152.533064] env[67977]: DEBUG nova.virt.hardware [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2152.533310] env[67977]: 
DEBUG nova.virt.hardware [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2152.533477] env[67977]: DEBUG nova.virt.hardware [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2152.533646] env[67977]: DEBUG nova.virt.hardware [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2152.533808] env[67977]: DEBUG nova.virt.hardware [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2152.533976] env[67977]: DEBUG nova.virt.hardware [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2152.534843] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8ce04a1-b9f4-4693-a4d9-54c6bbcbba50 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.542849] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51cb1266-1704-41dd-8c2b-2cf141d45b06 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.906795] env[67977]: DEBUG nova.network.neutron [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Successfully created port: d377dc7c-33cb-4f6c-90f7-53abc689777e {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2153.412666] env[67977]: DEBUG nova.compute.manager [req-60f8f536-3d72-462a-a535-374036b3e46e req-add6f12c-6891-46cf-8b22-1198f8599e9e service nova] [instance: 396fd258-dc89-4392-a487-921958012e92] Received event network-vif-plugged-d377dc7c-33cb-4f6c-90f7-53abc689777e {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2153.412940] env[67977]: DEBUG oslo_concurrency.lockutils [req-60f8f536-3d72-462a-a535-374036b3e46e req-add6f12c-6891-46cf-8b22-1198f8599e9e service nova] Acquiring lock "396fd258-dc89-4392-a487-921958012e92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2153.413115] env[67977]: DEBUG oslo_concurrency.lockutils [req-60f8f536-3d72-462a-a535-374036b3e46e 
req-add6f12c-6891-46cf-8b22-1198f8599e9e service nova] Lock "396fd258-dc89-4392-a487-921958012e92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2153.413458] env[67977]: DEBUG oslo_concurrency.lockutils [req-60f8f536-3d72-462a-a535-374036b3e46e req-add6f12c-6891-46cf-8b22-1198f8599e9e service nova] Lock "396fd258-dc89-4392-a487-921958012e92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2153.413458] env[67977]: DEBUG nova.compute.manager [req-60f8f536-3d72-462a-a535-374036b3e46e req-add6f12c-6891-46cf-8b22-1198f8599e9e service nova] [instance: 396fd258-dc89-4392-a487-921958012e92] No waiting events found dispatching network-vif-plugged-d377dc7c-33cb-4f6c-90f7-53abc689777e {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2153.413611] env[67977]: WARNING nova.compute.manager [req-60f8f536-3d72-462a-a535-374036b3e46e req-add6f12c-6891-46cf-8b22-1198f8599e9e service nova] [instance: 396fd258-dc89-4392-a487-921958012e92] Received unexpected event network-vif-plugged-d377dc7c-33cb-4f6c-90f7-53abc689777e for instance with vm_state building and task_state spawning. [ 2153.484653] env[67977]: DEBUG nova.network.neutron [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Successfully updated port: d377dc7c-33cb-4f6c-90f7-53abc689777e {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2153.496239] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "refresh_cache-396fd258-dc89-4392-a487-921958012e92" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2153.496389] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired lock "refresh_cache-396fd258-dc89-4392-a487-921958012e92" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2153.496711] env[67977]: DEBUG nova.network.neutron [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2153.533742] env[67977]: DEBUG nova.network.neutron [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2153.728359] env[67977]: DEBUG nova.network.neutron [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Updating instance_info_cache with network_info: [{"id": "d377dc7c-33cb-4f6c-90f7-53abc689777e", "address": "fa:16:3e:44:1c:8b", "network": {"id": "4eece86d-f584-415a-bcdd-5bde739283be", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1877932954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "52e6b27298274fa1a10d95d9a967814b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "975b168a-03e5-449d-95ac-4d51ba027242", "external-id": "nsx-vlan-transportzone-365", "segmentation_id": 365, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd377dc7c-33", "ovs_interfaceid": "d377dc7c-33cb-4f6c-90f7-53abc689777e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2153.741792] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Releasing lock "refresh_cache-396fd258-dc89-4392-a487-921958012e92" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2153.741992] env[67977]: DEBUG nova.compute.manager [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Instance network_info: |[{"id": "d377dc7c-33cb-4f6c-90f7-53abc689777e", "address": "fa:16:3e:44:1c:8b", "network": {"id": "4eece86d-f584-415a-bcdd-5bde739283be", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1877932954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "52e6b27298274fa1a10d95d9a967814b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "975b168a-03e5-449d-95ac-4d51ba027242", "external-id": "nsx-vlan-transportzone-365", "segmentation_id": 365, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd377dc7c-33", "ovs_interfaceid": "d377dc7c-33cb-4f6c-90f7-53abc689777e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2153.742398] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:44:1c:8b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '975b168a-03e5-449d-95ac-4d51ba027242', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd377dc7c-33cb-4f6c-90f7-53abc689777e', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2153.750284] env[67977]: DEBUG oslo.service.loopingcall [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2153.750944] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 396fd258-dc89-4392-a487-921958012e92] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2153.751122] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a21adf83-f6d2-4c9c-ad50-bca429bda789 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.772035] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2153.772035] env[67977]: value = "task-3468294" [ 2153.772035] env[67977]: _type = "Task" [ 2153.772035] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2153.780588] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468294, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2154.282248] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468294, 'name': CreateVM_Task, 'duration_secs': 0.28702} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2154.282417] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 396fd258-dc89-4392-a487-921958012e92] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2154.283115] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2154.283298] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2154.283625] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2154.283876] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f0f0b665-4481-4811-9f68-4a11d5f3ce49 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2154.288086] env[67977]: DEBUG oslo_vmware.api [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for the task: (returnval){ [ 2154.288086] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52912d6d-cd10-d81e-6f30-2cf0fa48e534" [ 2154.288086] env[67977]: _type = "Task" [ 2154.288086] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2154.295209] env[67977]: DEBUG oslo_vmware.api [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52912d6d-cd10-d81e-6f30-2cf0fa48e534, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2154.798940] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2154.799233] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2154.799428] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2155.444039] env[67977]: DEBUG nova.compute.manager [req-88098bb9-83e5-4d99-84e9-bfc62bc26d77 req-fee4eb81-33f6-4ea5-b362-084cdc10087a service nova] [instance: 396fd258-dc89-4392-a487-921958012e92] Received event network-changed-d377dc7c-33cb-4f6c-90f7-53abc689777e {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2155.444270] env[67977]: DEBUG nova.compute.manager [req-88098bb9-83e5-4d99-84e9-bfc62bc26d77 req-fee4eb81-33f6-4ea5-b362-084cdc10087a service nova] [instance: 396fd258-dc89-4392-a487-921958012e92] Refreshing instance network info cache due to event network-changed-d377dc7c-33cb-4f6c-90f7-53abc689777e. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2155.444481] env[67977]: DEBUG oslo_concurrency.lockutils [req-88098bb9-83e5-4d99-84e9-bfc62bc26d77 req-fee4eb81-33f6-4ea5-b362-084cdc10087a service nova] Acquiring lock "refresh_cache-396fd258-dc89-4392-a487-921958012e92" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2155.444624] env[67977]: DEBUG oslo_concurrency.lockutils [req-88098bb9-83e5-4d99-84e9-bfc62bc26d77 req-fee4eb81-33f6-4ea5-b362-084cdc10087a service nova] Acquired lock "refresh_cache-396fd258-dc89-4392-a487-921958012e92" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2155.444786] env[67977]: DEBUG nova.network.neutron [req-88098bb9-83e5-4d99-84e9-bfc62bc26d77 req-fee4eb81-33f6-4ea5-b362-084cdc10087a service nova] [instance: 396fd258-dc89-4392-a487-921958012e92] Refreshing network info cache for port d377dc7c-33cb-4f6c-90f7-53abc689777e {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2155.676399] env[67977]: DEBUG nova.network.neutron [req-88098bb9-83e5-4d99-84e9-bfc62bc26d77 req-fee4eb81-33f6-4ea5-b362-084cdc10087a service nova] [instance: 396fd258-dc89-4392-a487-921958012e92] Updated VIF entry in instance network info cache for port d377dc7c-33cb-4f6c-90f7-53abc689777e. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2155.676742] env[67977]: DEBUG nova.network.neutron [req-88098bb9-83e5-4d99-84e9-bfc62bc26d77 req-fee4eb81-33f6-4ea5-b362-084cdc10087a service nova] [instance: 396fd258-dc89-4392-a487-921958012e92] Updating instance_info_cache with network_info: [{"id": "d377dc7c-33cb-4f6c-90f7-53abc689777e", "address": "fa:16:3e:44:1c:8b", "network": {"id": "4eece86d-f584-415a-bcdd-5bde739283be", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1877932954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "52e6b27298274fa1a10d95d9a967814b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "975b168a-03e5-449d-95ac-4d51ba027242", "external-id": "nsx-vlan-transportzone-365", "segmentation_id": 365, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd377dc7c-33", "ovs_interfaceid": "d377dc7c-33cb-4f6c-90f7-53abc689777e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2155.686114] env[67977]: DEBUG oslo_concurrency.lockutils [req-88098bb9-83e5-4d99-84e9-bfc62bc26d77 req-fee4eb81-33f6-4ea5-b362-084cdc10087a service nova] Releasing lock "refresh_cache-396fd258-dc89-4392-a487-921958012e92" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2179.499013] env[67977]: WARNING oslo_vmware.rw_handles [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2179.499013] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2179.499013] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2179.499013] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2179.499013] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2179.499013] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 2179.499013] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2179.499013] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2179.499013] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2179.499013] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2179.499013] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2179.499013] env[67977]: ERROR oslo_vmware.rw_handles [ 2179.499694] env[67977]: DEBUG nova.virt.vmwareapi.images [None 
req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/06a86dcd-0b75-4d91-924c-e6eb141fb996/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2179.501407] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2179.501762] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Copying Virtual Disk [datastore1] vmware_temp/06a86dcd-0b75-4d91-924c-e6eb141fb996/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/06a86dcd-0b75-4d91-924c-e6eb141fb996/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2179.501895] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2b796db8-c14c-48dc-adcf-3587c35ffb60 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2179.510175] env[67977]: DEBUG oslo_vmware.api [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 2179.510175] env[67977]: value = "task-3468295" [ 2179.510175] env[67977]: _type = "Task" [ 2179.510175] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2179.517773] env[67977]: DEBUG oslo_vmware.api [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468295, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2180.022086] env[67977]: DEBUG oslo_vmware.exceptions [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2180.022086] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2180.022086] env[67977]: ERROR nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2180.022086] env[67977]: Faults: ['InvalidArgument'] [ 2180.022086] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Traceback (most recent call last): [ 2180.022086] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2180.022086] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] yield resources [ 2180.022086] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2180.022086] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] self.driver.spawn(context, instance, image_meta, [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] self._fetch_image_if_missing(context, vi) [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] image_cache(vi, tmp_image_ds_loc) [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] vm_util.copy_virtual_disk( [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] session._wait_for_task(vmdk_copy_task) [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] return self.wait_for_task(task_ref) [ 2180.022585] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] return evt.wait() [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] result = hub.switch() [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] return self.greenlet.switch() [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] self.f(*self.args, **self.kw) [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] raise exceptions.translate_fault(task_info.error) [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Faults: ['InvalidArgument'] [ 2180.022998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] [ 2180.023407] env[67977]: INFO nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Terminating instance [ 2180.023554] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2180.023775] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2180.024022] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6fa7172e-a6ad-4121-b0dd-899a29e21cd4 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.026241] env[67977]: DEBUG nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2180.026433] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2180.027156] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16aef4eb-8bc1-4603-b477-4d638ec6fbca {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.033744] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2180.033951] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2bf9517f-3893-4df9-b093-43fc48050489 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.036069] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2180.036246] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2180.037163] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cec080c2-966f-4c03-b90c-eac3939507c7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.041637] env[67977]: DEBUG oslo_vmware.api [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){ [ 2180.041637] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52837889-1ed0-1c0f-a436-2db3d841e569" [ 2180.041637] env[67977]: _type = "Task" [ 2180.041637] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2180.052793] env[67977]: DEBUG oslo_vmware.api [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52837889-1ed0-1c0f-a436-2db3d841e569, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2180.095822] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2180.096086] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2180.096281] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleting the datastore file [datastore1] 511896d4-d9cb-42e0-b213-31be3cac191c {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2180.096543] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a5a264d9-e580-4af9-b7eb-5fec23661bcd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.102220] env[67977]: DEBUG oslo_vmware.api [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 2180.102220] env[67977]: value = "task-3468297" [ 2180.102220] env[67977]: _type = "Task" [ 2180.102220] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2180.109505] env[67977]: DEBUG oslo_vmware.api [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468297, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2180.552257] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2180.552611] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Creating directory with path [datastore1] vmware_temp/1d586b62-b117-4a41-a9fe-a483f665581f/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2180.552730] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5af4cf87-3017-4732-a831-4aebbd8bc490 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.563614] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Created directory with path [datastore1] vmware_temp/1d586b62-b117-4a41-a9fe-a483f665581f/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2180.563803] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Fetch image to [datastore1] vmware_temp/1d586b62-b117-4a41-a9fe-a483f665581f/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2180.563970] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/1d586b62-b117-4a41-a9fe-a483f665581f/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2180.564686] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bf3afde-bb4f-4ec2-b204-a1484df61524 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.570768] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4294bbc4-ebf9-4a7b-8fc0-d75dbb068cc5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.579381] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-468d5400-3818-4502-ab24-e9799509cfa0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.612938] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e856866f-d0e2-4e13-a0ca-375bff395099 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.619677] env[67977]: DEBUG oslo_vmware.api [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468297, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076879} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2180.621086] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2180.621410] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2180.621501] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2180.621604] env[67977]: INFO nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Took 0.60 seconds to destroy the instance on the hypervisor. 
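The task records here ('Waiting for the task: (returnval){ value = "task-3468297" ... } to complete', "progress is 0%", "completed successfully") come from oslo.vmware's wait_for_task polling loop (the api.py:397/434/444 call sites). A hedged sketch of that pattern, assuming a reachable vCenter; the endpoint and credentials below are hypothetical placeholders, and the datastore path is taken from the records above:

```python
from oslo_vmware import api

# Hypothetical endpoint and credentials; the session logs in to vCenter
# when it is constructed.
session = api.VMwareAPISession("vc.example.test", "admin", "secret",
                               api_retry_count=10, task_poll_interval=0.5)

# Start a vSphere task, then poll it to completion; this produces the
# "Waiting for the task" / "progress is 0%" / "completed successfully"
# DEBUG lines seen in the log.
file_manager = session.vim.service_content.fileManager
task = session.invoke_api(
    session.vim, "DeleteDatastoreFile_Task", file_manager,
    name="[datastore1] 511896d4-d9cb-42e0-b213-31be3cac191c",
    datacenter=None)  # Nova's ds_util passes the datacenter moref here
session.wait_for_task(task)  # raises a translated fault if the task errors
```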
[ 2180.623315] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-59be7751-0dbd-4a72-b8e5-71ca5069218c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.625155] env[67977]: DEBUG nova.compute.claims [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2180.625329] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2180.625540] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2180.648767] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2180.700993] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d586b62-b117-4a41-a9fe-a483f665581f/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2180.758951] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2180.759154] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d586b62-b117-4a41-a9fe-a483f665581f/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
[ 2180.759154] env[67977]: DEBUG oslo_vmware.rw_handles [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d586b62-b117-4a41-a9fe-a483f665581f/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2180.835413] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-817ee05c-41fc-478d-8341-0e9164afed16 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2180.842621] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b37d3950-13e3-42b0-bb8d-aa25fb9882d0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2180.871432] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc07539a-d20b-4ef7-b92e-e2fe44359c9a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2180.878404] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f16817e3-0b82-4460-b44b-8f1517cda60f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2180.891999] env[67977]: DEBUG nova.compute.provider_tree [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2180.900320] env[67977]: DEBUG nova.scheduler.client.report [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2180.913113] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.287s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2180.913636] env[67977]: ERROR nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2180.913636] env[67977]: Faults: ['InvalidArgument']
[ 2180.913636] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Traceback (most recent call last):
[ 2180.913636] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2180.913636] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] self.driver.spawn(context, instance, image_meta,
[ 2180.913636] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2180.913636] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2180.913636] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2180.913636] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] self._fetch_image_if_missing(context, vi)
[ 2180.913636] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2180.913636] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] image_cache(vi, tmp_image_ds_loc)
[ 2180.913636] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] vm_util.copy_virtual_disk(
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] session._wait_for_task(vmdk_copy_task)
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] return self.wait_for_task(task_ref)
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] return evt.wait()
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] result = hub.switch()
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] return self.greenlet.switch()
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2180.913998] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] self.f(*self.args, **self.kw)
[ 2180.914363] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2180.914363] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] raise exceptions.translate_fault(task_info.error)
[ 2180.914363] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2180.914363] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Faults: ['InvalidArgument']
[ 2180.914363] env[67977]: ERROR nova.compute.manager [instance: 511896d4-d9cb-42e0-b213-31be3cac191c]
[ 2180.914363] env[67977]: DEBUG nova.compute.utils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2180.915653] env[67977]: DEBUG nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Build of instance 511896d4-d9cb-42e0-b213-31be3cac191c was re-scheduled: A specified parameter was not correct: fileType
[ 2180.915653] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2180.916032] env[67977]: DEBUG nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2180.916213] env[67977]: DEBUG nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2180.916410] env[67977]: DEBUG nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2180.916539] env[67977]: DEBUG nova.network.neutron [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2181.228288] env[67977]: DEBUG nova.network.neutron [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2181.240503] env[67977]: INFO nova.compute.manager [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Took 0.32 seconds to deallocate network for instance.
[ 2181.345059] env[67977]: INFO nova.scheduler.client.report [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleted allocations for instance 511896d4-d9cb-42e0-b213-31be3cac191c
[ 2181.367032] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fce8b1f5-5b0b-48ea-bc2f-fd3a9b491f28 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "511896d4-d9cb-42e0-b213-31be3cac191c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 623.594s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2181.367220] env[67977]: DEBUG oslo_concurrency.lockutils [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "511896d4-d9cb-42e0-b213-31be3cac191c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 427.118s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2181.367435] env[67977]: DEBUG oslo_concurrency.lockutils [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "511896d4-d9cb-42e0-b213-31be3cac191c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2181.367658] env[67977]: DEBUG oslo_concurrency.lockutils [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "511896d4-d9cb-42e0-b213-31be3cac191c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2181.367827] env[67977]: DEBUG oslo_concurrency.lockutils [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "511896d4-d9cb-42e0-b213-31be3cac191c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2181.369721] env[67977]: INFO nova.compute.manager [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Terminating instance [ 2181.371421] env[67977]: DEBUG nova.compute.manager [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2181.371614] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2181.372081] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b9835090-2f7e-47ac-9f94-95939d92fe94 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2181.381475] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-104dd925-7094-4992-97b5-cf44b7848b26 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2181.408902] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 511896d4-d9cb-42e0-b213-31be3cac191c could not be found. [ 2181.409119] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2181.409373] env[67977]: INFO nova.compute.manager [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2181.409626] env[67977]: DEBUG oslo.service.loopingcall [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2181.409838] env[67977]: DEBUG nova.compute.manager [-] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2181.409937] env[67977]: DEBUG nova.network.neutron [-] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2181.433032] env[67977]: DEBUG nova.network.neutron [-] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2181.441880] env[67977]: INFO nova.compute.manager [-] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] Took 0.03 seconds to deallocate network for instance. [ 2181.529424] env[67977]: DEBUG oslo_concurrency.lockutils [None req-798cba0c-3d70-456f-b12f-9df650f316f1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "511896d4-d9cb-42e0-b213-31be3cac191c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.162s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2181.530330] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "511896d4-d9cb-42e0-b213-31be3cac191c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 276.804s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2181.530682] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 511896d4-d9cb-42e0-b213-31be3cac191c] During sync_power_state the instance has a pending task (deleting). Skip. 
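
Every Acquiring/acquired/released triplet in this log, from the "compute_resources" lock to the per-instance and per-instance-events locks above, is emitted by oslo.concurrency's lockutils wrapper, which times how long a caller waited for and then held a named semaphore (the 427.118s wait on the instance lock above is a green thread blocked behind the long-running build). A minimal reproduction of the pattern; the function bodies are placeholders:

    import time

    from oslo_concurrency import lockutils

    # Decorator form: the body runs under the named internal semaphore, and
    # lockutils logs the acquire/held timings seen throughout this log.
    @lockutils.synchronized('compute_resources')
    def update_tracker():
        time.sleep(0.1)  # another caller entering now would log a wait

    # Context-manager form, as used for the refresh_cache-<uuid> locks that
    # appear later in the log:
    def refresh_network_cache(instance_uuid):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # rebuild the instance's network info cache here
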
[ 2181.530766] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "511896d4-d9cb-42e0-b213-31be3cac191c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2181.881702] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "19f06190-ebc3-4089-81d4-1ac09ec46b46" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2181.882011] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "19f06190-ebc3-4089-81d4-1ac09ec46b46" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2181.892301] env[67977]: DEBUG nova.compute.manager [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2181.939042] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2181.939291] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2181.940647] env[67977]: INFO nova.compute.claims [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2182.089351] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fb265ec-4fbd-407f-a7cd-f9361321cce6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2182.097842] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2a1e595-e5aa-45fa-af74-c1a1deb805a8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2182.126938] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d69cdf68-b804-473b-a60b-4fa342638897 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2182.134073] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e40a416-5e0d-4fe5-a74a-f57a7197c38e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2182.147195] env[67977]: DEBUG nova.compute.provider_tree [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2182.155934] env[67977]: DEBUG nova.scheduler.client.report [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2182.168800] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.229s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2182.169258] env[67977]: DEBUG nova.compute.manager [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2182.206170] env[67977]: DEBUG nova.compute.utils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2182.207465] env[67977]: DEBUG nova.compute.manager [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2182.207564] env[67977]: DEBUG nova.network.neutron [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2182.216748] env[67977]: DEBUG nova.compute.manager [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Start building block device mappings for instance. 
{{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2182.271629] env[67977]: DEBUG nova.policy [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4965d451810c48458246493019d83172', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d528c04bd83409eb74e20393651c040', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 2182.274569] env[67977]: DEBUG nova.compute.manager [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2182.299221] env[67977]: DEBUG nova.virt.hardware [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2182.299449] env[67977]: DEBUG nova.virt.hardware [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2182.299607] env[67977]: DEBUG nova.virt.hardware [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2182.299813] env[67977]: DEBUG nova.virt.hardware [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2182.300017] env[67977]: DEBUG nova.virt.hardware [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2182.300193] env[67977]: DEBUG nova.virt.hardware [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 
tempest-ServersTestJSON-1986579007-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2182.300401] env[67977]: DEBUG nova.virt.hardware [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2182.300557] env[67977]: DEBUG nova.virt.hardware [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2182.300720] env[67977]: DEBUG nova.virt.hardware [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2182.300878] env[67977]: DEBUG nova.virt.hardware [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2182.301059] env[67977]: DEBUG nova.virt.hardware [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2182.301890] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e59d112-b3d9-4cb4-bc4c-ce38a5680d9f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2182.309787] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de3cee53-dfce-4de3-9fcd-9102fffb5571 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2182.563582] env[67977]: DEBUG nova.network.neutron [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Successfully created port: 8fcb546a-7bec-4795-9980-7bb28cc01ad0 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2183.387467] env[67977]: DEBUG nova.compute.manager [req-9b7cbb70-3c7a-450e-881e-f5a1c304ab7b req-40dd3b77-07e8-4fc9-89c3-b3c1ceb8763d service nova] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Received event network-vif-plugged-8fcb546a-7bec-4795-9980-7bb28cc01ad0 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2183.387743] env[67977]: DEBUG oslo_concurrency.lockutils [req-9b7cbb70-3c7a-450e-881e-f5a1c304ab7b req-40dd3b77-07e8-4fc9-89c3-b3c1ceb8763d service nova] Acquiring lock "19f06190-ebc3-4089-81d4-1ac09ec46b46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2183.387881] env[67977]: DEBUG oslo_concurrency.lockutils [req-9b7cbb70-3c7a-450e-881e-f5a1c304ab7b req-40dd3b77-07e8-4fc9-89c3-b3c1ceb8763d service nova] Lock "19f06190-ebc3-4089-81d4-1ac09ec46b46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2183.388110] env[67977]: DEBUG oslo_concurrency.lockutils [req-9b7cbb70-3c7a-450e-881e-f5a1c304ab7b req-40dd3b77-07e8-4fc9-89c3-b3c1ceb8763d service nova] Lock "19f06190-ebc3-4089-81d4-1ac09ec46b46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2183.388262] env[67977]: DEBUG nova.compute.manager [req-9b7cbb70-3c7a-450e-881e-f5a1c304ab7b req-40dd3b77-07e8-4fc9-89c3-b3c1ceb8763d service nova] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] No waiting events found dispatching network-vif-plugged-8fcb546a-7bec-4795-9980-7bb28cc01ad0 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2183.388450] env[67977]: WARNING nova.compute.manager [req-9b7cbb70-3c7a-450e-881e-f5a1c304ab7b req-40dd3b77-07e8-4fc9-89c3-b3c1ceb8763d service nova] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Received unexpected event network-vif-plugged-8fcb546a-7bec-4795-9980-7bb28cc01ad0 for instance with vm_state building and task_state spawning. [ 2183.458460] env[67977]: DEBUG nova.network.neutron [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Successfully updated port: 8fcb546a-7bec-4795-9980-7bb28cc01ad0 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2183.468975] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "refresh_cache-19f06190-ebc3-4089-81d4-1ac09ec46b46" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2183.469138] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquired lock "refresh_cache-19f06190-ebc3-4089-81d4-1ac09ec46b46" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2183.469418] env[67977]: DEBUG nova.network.neutron [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2183.505378] env[67977]: DEBUG nova.network.neutron [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2183.655475] env[67977]: DEBUG nova.network.neutron [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Updating instance_info_cache with network_info: [{"id": "8fcb546a-7bec-4795-9980-7bb28cc01ad0", "address": "fa:16:3e:21:9e:32", "network": {"id": "da4332ac-6c00-499e-81de-2b64bd556acc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1328497137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d528c04bd83409eb74e20393651c040", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ded18042-834c-4792-b3e8-b1c377446432", "external-id": "nsx-vlan-transportzone-293", "segmentation_id": 293, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fcb546a-7b", "ovs_interfaceid": "8fcb546a-7bec-4795-9980-7bb28cc01ad0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2183.666628] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Releasing lock "refresh_cache-19f06190-ebc3-4089-81d4-1ac09ec46b46" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2183.666895] env[67977]: DEBUG nova.compute.manager [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Instance network_info: |[{"id": "8fcb546a-7bec-4795-9980-7bb28cc01ad0", "address": "fa:16:3e:21:9e:32", "network": {"id": "da4332ac-6c00-499e-81de-2b64bd556acc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1328497137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d528c04bd83409eb74e20393651c040", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ded18042-834c-4792-b3e8-b1c377446432", "external-id": "nsx-vlan-transportzone-293", "segmentation_id": 293, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fcb546a-7b", "ovs_interfaceid": "8fcb546a-7bec-4795-9980-7bb28cc01ad0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2183.667305] env[67977]: 
DEBUG nova.virt.vmwareapi.vmops [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:21:9e:32', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ded18042-834c-4792-b3e8-b1c377446432', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8fcb546a-7bec-4795-9980-7bb28cc01ad0', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 2183.674741] env[67977]: DEBUG oslo.service.loopingcall [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2183.675219] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 2183.675440] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d48c7698-4290-4920-a177-540313c9ead4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2183.695539] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 2183.695539] env[67977]: value = "task-3468298"
[ 2183.695539] env[67977]: _type = "Task"
[ 2183.695539] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2183.702968] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468298, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
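
The Folder.CreateVM_Task invocation followed by the "Waiting for the task" and "progress is 0%" entries is the standard oslo.vmware pattern: call a vCenter method that returns a task managed-object reference, then block on wait_for_task, which polls task.info until success or error (raising the translated fault seen in the earlier traceback on failure). A sketch with stand-in arguments; vm_folder, config_spec and res_pool here are placeholders for the managed object references and VirtualMachineConfigSpec Nova builds from the flavor, image metadata, and VIF info logged above:

    def create_vm(session, vm_folder, config_spec, res_pool):
        # invoke_api dispatches a VIM call through the session; CreateVM_Task
        # returns immediately with a task moref like "task-3468298" above.
        task = session.invoke_api(session.vim, 'CreateVM_Task', vm_folder,
                                  config=config_spec, pool=res_pool)
        # wait_for_task polls the task (the "progress is 0%" lines) and
        # returns its final info on success.
        task_info = session.wait_for_task(task)
        return task_info.result  # the new VM's managed object reference
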
[ 2184.205965] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468298, 'name': CreateVM_Task, 'duration_secs': 0.337615} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2184.206169] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 2184.206844] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2184.207018] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2184.207335] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 2184.207587] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4868c6d7-53fd-4a99-b9a0-ac4348ca922e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2184.212071] env[67977]: DEBUG oslo_vmware.api [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for the task: (returnval){
[ 2184.212071] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52a22538-1888-741c-4179-67332474c551"
[ 2184.212071] env[67977]: _type = "Task"
[ 2184.212071] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2184.219699] env[67977]: DEBUG oslo_vmware.api [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52a22538-1888-741c-4179-67332474c551, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2184.722391] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2184.722697] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2184.722840] env[67977]: DEBUG oslo_concurrency.lockutils [None req-53bb8cc5-60e3-4da4-85cf-415b5c715a8e tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2185.100890] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2185.101210] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2185.412433] env[67977]: DEBUG nova.compute.manager [req-d54bee0a-d36b-43a0-a561-e34d4d77bdce req-472fe1da-7b17-43bd-bd9e-48b7bdcc31ae service nova] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Received event network-changed-8fcb546a-7bec-4795-9980-7bb28cc01ad0 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2185.412587] env[67977]: DEBUG nova.compute.manager [req-d54bee0a-d36b-43a0-a561-e34d4d77bdce req-472fe1da-7b17-43bd-bd9e-48b7bdcc31ae service nova] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Refreshing instance network info cache due to event network-changed-8fcb546a-7bec-4795-9980-7bb28cc01ad0. 
{{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2185.412834] env[67977]: DEBUG oslo_concurrency.lockutils [req-d54bee0a-d36b-43a0-a561-e34d4d77bdce req-472fe1da-7b17-43bd-bd9e-48b7bdcc31ae service nova] Acquiring lock "refresh_cache-19f06190-ebc3-4089-81d4-1ac09ec46b46" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2185.412984] env[67977]: DEBUG oslo_concurrency.lockutils [req-d54bee0a-d36b-43a0-a561-e34d4d77bdce req-472fe1da-7b17-43bd-bd9e-48b7bdcc31ae service nova] Acquired lock "refresh_cache-19f06190-ebc3-4089-81d4-1ac09ec46b46" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2185.413160] env[67977]: DEBUG nova.network.neutron [req-d54bee0a-d36b-43a0-a561-e34d4d77bdce req-472fe1da-7b17-43bd-bd9e-48b7bdcc31ae service nova] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Refreshing network info cache for port 8fcb546a-7bec-4795-9980-7bb28cc01ad0 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2185.648507] env[67977]: DEBUG nova.network.neutron [req-d54bee0a-d36b-43a0-a561-e34d4d77bdce req-472fe1da-7b17-43bd-bd9e-48b7bdcc31ae service nova] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Updated VIF entry in instance network info cache for port 8fcb546a-7bec-4795-9980-7bb28cc01ad0. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2185.648861] env[67977]: DEBUG nova.network.neutron [req-d54bee0a-d36b-43a0-a561-e34d4d77bdce req-472fe1da-7b17-43bd-bd9e-48b7bdcc31ae service nova] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Updating instance_info_cache with network_info: [{"id": "8fcb546a-7bec-4795-9980-7bb28cc01ad0", "address": "fa:16:3e:21:9e:32", "network": {"id": "da4332ac-6c00-499e-81de-2b64bd556acc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1328497137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d528c04bd83409eb74e20393651c040", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ded18042-834c-4792-b3e8-b1c377446432", "external-id": "nsx-vlan-transportzone-293", "segmentation_id": 293, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fcb546a-7b", "ovs_interfaceid": "8fcb546a-7bec-4795-9980-7bb28cc01ad0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2185.658294] env[67977]: DEBUG oslo_concurrency.lockutils [req-d54bee0a-d36b-43a0-a561-e34d4d77bdce req-472fe1da-7b17-43bd-bd9e-48b7bdcc31ae service nova] Releasing lock "refresh_cache-19f06190-ebc3-4089-81d4-1ac09ec46b46" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2185.775547] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2186.775444] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2187.775253] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2187.775750] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2189.776170] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2190.770662] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2191.775555] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2191.775852] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2191.775904] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2191.799406] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.799571] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.799704] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.799831] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.799956] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.800091] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.800212] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.800327] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 396fd258-dc89-4392-a487-921958012e92] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.800443] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.800560] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2192.775582] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2192.787203] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2192.787426] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2192.787594] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2192.787748] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2192.788852] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5dc2a4c-e047-42c2-92c3-62ce83db7528 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.797691] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90727bfc-fd96-4b77-906c-bf0cb1bb9119 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.811326] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df453528-959a-4f92-8dd5-badb963094ac {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.817686] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fef8126a-536a-4fc5-95e9-d37c01164b68 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.846073] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180928MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2192.846190] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
2192.846366] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2192.940153] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2192.940355] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2192.940486] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2192.940610] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2192.940731] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 98f7c8cc-b27f-406c-b34d-22c2c4609e24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2192.940850] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 142d3b29-b467-4007-84ac-8b7e0ee9e326 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2192.940970] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1536ad10-129b-439d-80c5-08fa92aeaed1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2192.941100] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 396fd258-dc89-4392-a487-921958012e92 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2192.941217] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 19f06190-ebc3-4089-81d4-1ac09ec46b46 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2192.941411] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2192.941547] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2192.957781] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing inventories for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2192.971018] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating ProviderTree inventory for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2192.971223] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating inventory in ProviderTree for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2192.981736] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing aggregate associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, aggregates: None {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2192.998517] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing trait associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, traits: COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67977) _refresh_associations 
/opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2193.101564] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-473ff307-7003-4365-978f-ceb087f13266 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.110687] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34c76830-d1bb-4200-bc7f-97cd1a9a27a7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.140280] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e1019fe-6b64-4b1b-89da-0bfcda9d9fa0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.147509] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbc31dd2-1a71-43ff-8031-679d6b390419 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.160541] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2193.169708] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2193.183700] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2193.183890] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.338s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2197.194705] env[67977]: DEBUG oslo_concurrency.lockutils [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "1536ad10-129b-439d-80c5-08fa92aeaed1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2198.775867] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2198.776184] env[67977]: DEBUG nova.compute.manager [None 
req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Cleaning up deleted instances {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2198.786016] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] There are 0 instances to clean {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2199.775386] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2199.775569] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Cleaning up deleted instances with incomplete migration {{(pid=67977) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2200.776062] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2229.518293] env[67977]: WARNING oslo_vmware.rw_handles [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2229.518293] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2229.518293] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2229.518293] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2229.518293] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2229.518293] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 2229.518293] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2229.518293] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2229.518293] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2229.518293] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2229.518293] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2229.518293] env[67977]: ERROR oslo_vmware.rw_handles [ 2229.518989] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/1d586b62-b117-4a41-a9fe-a483f665581f/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2229.521360] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Caching image {{(pid=67977) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2229.521644] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Copying Virtual Disk [datastore1] vmware_temp/1d586b62-b117-4a41-a9fe-a483f665581f/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/1d586b62-b117-4a41-a9fe-a483f665581f/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2229.521942] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c98e64d7-b676-4931-8424-5b1774e9427f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.530110] env[67977]: DEBUG oslo_vmware.api [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){ [ 2229.530110] env[67977]: value = "task-3468299" [ 2229.530110] env[67977]: _type = "Task" [ 2229.530110] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2229.538470] env[67977]: DEBUG oslo_vmware.api [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': task-3468299, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2230.040717] env[67977]: DEBUG oslo_vmware.exceptions [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2230.041017] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2230.041593] env[67977]: ERROR nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2230.041593] env[67977]: Faults: ['InvalidArgument'] [ 2230.041593] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Traceback (most recent call last): [ 2230.041593] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2230.041593] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] yield resources [ 2230.041593] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2230.041593] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] self.driver.spawn(context, instance, image_meta, [ 2230.041593] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2230.041593] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2230.041593] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2230.041593] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] self._fetch_image_if_missing(context, vi) [ 2230.041593] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2230.041593] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] image_cache(vi, tmp_image_ds_loc) [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] vm_util.copy_virtual_disk( [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] session._wait_for_task(vmdk_copy_task) [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] return self.wait_for_task(task_ref) [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] return evt.wait() [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] result = hub.switch() [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] return self.greenlet.switch() [ 2230.042044] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2230.042622] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] self.f(*self.args, **self.kw) [ 2230.042622] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2230.042622] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] raise exceptions.translate_fault(task_info.error) [ 2230.042622] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2230.042622] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Faults: ['InvalidArgument'] [ 2230.042622] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] [ 2230.042622] env[67977]: INFO nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Terminating instance [ 2230.043511] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2230.043730] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2230.043972] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dac8a01e-c153-4ef3-afbb-afc857064cd3 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.046324] env[67977]: DEBUG nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2230.046519] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2230.047228] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cda04fdb-80d4-456a-a615-d3f3250dcc43 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.053831] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2230.054073] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0f054598-6db6-46aa-be7b-510479362a4c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.056135] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2230.056311] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2230.057230] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-114c318e-a4bc-4824-a6ec-8b93c8c3313f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.061837] env[67977]: DEBUG oslo_vmware.api [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for the task: (returnval){ [ 2230.061837] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52abcfc9-d5a5-0543-bf9d-3fd581aac729" [ 2230.061837] env[67977]: _type = "Task" [ 2230.061837] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2230.068521] env[67977]: DEBUG oslo_vmware.api [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52abcfc9-d5a5-0543-bf9d-3fd581aac729, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2230.130859] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2230.131077] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2230.131258] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Deleting the datastore file [datastore1] 157e3bfe-10cc-49c6-aa31-1d935e1a4465 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2230.131514] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f5f6a020-59cf-4cce-864e-21f01aedbcb5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.137412] env[67977]: DEBUG oslo_vmware.api [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){ [ 2230.137412] env[67977]: value = "task-3468301" [ 2230.137412] env[67977]: _type = "Task" [ 2230.137412] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2230.145257] env[67977]: DEBUG oslo_vmware.api [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': task-3468301, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2230.572093] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2230.572381] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Creating directory with path [datastore1] vmware_temp/ae9f2ed0-0a2b-4ac4-81ee-f4ca17e59388/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2230.572581] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-27bcb4bc-24a0-466f-a078-494f4d1601f7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.583809] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Created directory with path [datastore1] vmware_temp/ae9f2ed0-0a2b-4ac4-81ee-f4ca17e59388/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2230.583994] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Fetch image to [datastore1] vmware_temp/ae9f2ed0-0a2b-4ac4-81ee-f4ca17e59388/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2230.584183] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/ae9f2ed0-0a2b-4ac4-81ee-f4ca17e59388/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2230.584933] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f9753df-a4d7-4dbd-b98e-cb1150a6c6a8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.591524] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdc50d8f-ff76-4a33-bb37-53d24496dde3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.600411] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32e8a820-54f6-40ef-9f79-79c1185b7bfa {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.631521] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ffd2c1fd-0a7d-4ccb-9ac1-a408f5c72c27 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.636904] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1c3b586e-3dee-489b-a4bb-90c11bc71d44 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.646015] env[67977]: DEBUG oslo_vmware.api [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': task-3468301, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064461} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2230.646015] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2230.646160] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2230.647796] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2230.647796] env[67977]: INFO nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Took 0.60 seconds to destroy the instance on the hypervisor. 
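The DeleteDatastoreFile_Task records above show the oslo.vmware task-polling pattern: the API session periodically reads the vSphere TaskInfo, logs progress, and either returns on success (with a duration_secs field) or raises the fault translated from the task's error field, which is how the VimFaultException earlier in this log surfaced. Below is a minimal sketch of that loop in plain Python, assuming a hypothetical get_task_info callable that returns an object shaped like TaskInfo (.state, .progress, .error); this is not the oslo.vmware source.

import time

class TaskFaultError(Exception):
    """Stand-in for the translated VIM fault (cf. VimFaultException above)."""

def wait_for_task(get_task_info, poll_interval=0.5):
    """Poll a vCenter-style task until it reaches a terminal state.

    get_task_info is a hypothetical callable standing in for the real
    session plumbing that retrieves TaskInfo from vCenter.
    """
    while True:
        info = get_task_info()
        if info.state in ('queued', 'running'):
            # Corresponds to the "Task: {...} progress is 0%." records above.
            print('Task progress is %s%%.' % (info.progress or 0))
            time.sleep(poll_interval)
            continue
        if info.state == 'success':
            # Corresponds to "... completed successfully" with duration_secs.
            return info
        # Terminal error state: oslo.vmware raises the translated fault here,
        # e.g. the InvalidArgument/fileType fault seen earlier in this log.
        raise TaskFaultError(info.error)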
[ 2230.650066] env[67977]: DEBUG nova.compute.claims [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2230.650066] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2230.650066] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2230.660031] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2230.733962] env[67977]: DEBUG oslo_vmware.rw_handles [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ae9f2ed0-0a2b-4ac4-81ee-f4ca17e59388/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2230.792836] env[67977]: DEBUG oslo_vmware.rw_handles [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2230.792836] env[67977]: DEBUG oslo_vmware.rw_handles [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ae9f2ed0-0a2b-4ac4-81ee-f4ca17e59388/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2230.859235] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f11a9a80-eb36-4386-a4c6-a2e38745eb1a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.866676] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79a73f93-cf56-40c7-94bb-1aef68ddba57 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.898022] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6609f33-6974-4c1e-b181-d8b79b8e0d99 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.904320] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c183b71-399c-4aa7-a50f-e8b00f24201c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2230.918355] env[67977]: DEBUG nova.compute.provider_tree [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2230.927377] env[67977]: DEBUG nova.scheduler.client.report [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2230.943896] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.295s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2230.944427] env[67977]: ERROR nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2230.944427] env[67977]: Faults: ['InvalidArgument'] [ 2230.944427] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Traceback (most recent call last): [ 2230.944427] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2230.944427] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] 
self.driver.spawn(context, instance, image_meta, [ 2230.944427] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2230.944427] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2230.944427] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2230.944427] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] self._fetch_image_if_missing(context, vi) [ 2230.944427] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2230.944427] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] image_cache(vi, tmp_image_ds_loc) [ 2230.944427] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] vm_util.copy_virtual_disk( [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] session._wait_for_task(vmdk_copy_task) [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] return self.wait_for_task(task_ref) [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] return evt.wait() [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] result = hub.switch() [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] return self.greenlet.switch() [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2230.944996] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] self.f(*self.args, **self.kw) [ 2230.945429] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 2230.945429] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] raise exceptions.translate_fault(task_info.error) [ 2230.945429] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2230.945429] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Faults: ['InvalidArgument'] [ 2230.945429] env[67977]: ERROR nova.compute.manager [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] [ 2230.945429] env[67977]: DEBUG nova.compute.utils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2230.946572] env[67977]: DEBUG nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Build of instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 was re-scheduled: A specified parameter was not correct: fileType [ 2230.946572] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2230.946956] env[67977]: DEBUG nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2230.947119] env[67977]: DEBUG nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2230.947295] env[67977]: DEBUG nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2230.947457] env[67977]: DEBUG nova.network.neutron [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2231.234623] env[67977]: DEBUG nova.network.neutron [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2231.245657] env[67977]: INFO nova.compute.manager [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Took 0.30 seconds to deallocate network for instance.
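The lock bookkeeping in the records that follow ("Acquiring lock ... by ...", "acquired ... waited Ns", "released ... held Ns") is emitted by oslo.concurrency, which Nova uses to serialize the build and terminate paths on a per-instance-UUID lock. A minimal sketch of the same pattern, assuming oslo.concurrency is installed; the function body is an illustrative no-op, not Nova code.

from oslo_concurrency import lockutils

# Synchronize on the instance UUID, as the "157e3bfe-..." lock records do.
@lockutils.synchronized('157e3bfe-10cc-49c6-aa31-1d935e1a4465')
def do_terminate_instance():
    # While another caller holds this lock, we block in the decorator; the
    # wait is what appears below as "acquired ... :: waited 428.034s".
    pass

That queuing is what the long waits below reflect: the terminate request could only acquire the UUID lock once the build-and-run path, which held it for 623.250 seconds, released it.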
[ 2231.351305] env[67977]: INFO nova.scheduler.client.report [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Deleted allocations for instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 [ 2231.371200] env[67977]: DEBUG oslo_concurrency.lockutils [None req-4b947015-284c-4d3e-934e-ed1ced47acb5 tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 623.250s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2231.371466] env[67977]: DEBUG oslo_concurrency.lockutils [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 428.034s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2231.371685] env[67977]: DEBUG oslo_concurrency.lockutils [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2231.371892] env[67977]: DEBUG oslo_concurrency.lockutils [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2231.372073] env[67977]: DEBUG oslo_concurrency.lockutils [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2231.374105] env[67977]: INFO nova.compute.manager [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Terminating instance [ 2231.376180] env[67977]: DEBUG nova.compute.manager [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2231.376392] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2231.376868] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-292580f0-e52b-487d-82c2-c4fbd9d9596e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2231.385658] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffc7bf2d-7810-4cfd-961c-bb2ef37593c8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2231.412696] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 157e3bfe-10cc-49c6-aa31-1d935e1a4465 could not be found. [ 2231.412908] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2231.413114] env[67977]: INFO nova.compute.manager [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2231.413357] env[67977]: DEBUG oslo.service.loopingcall [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2231.413571] env[67977]: DEBUG nova.compute.manager [-] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2231.413667] env[67977]: DEBUG nova.network.neutron [-] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2231.436537] env[67977]: DEBUG nova.network.neutron [-] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2231.444015] env[67977]: INFO nova.compute.manager [-] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] Took 0.03 seconds to deallocate network for instance. 
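The runs of "Running periodic task ComputeManager._poll_*" records before and after this point come from oslo.service's periodic task machinery: methods decorated with periodic_task on a PeriodicTasks subclass are collected and dispatched by run_periodic_tasks, which logs each dispatch at DEBUG. A minimal sketch, assuming oslo.service and oslo.config are installed; the task name and 60-second spacing are illustrative, not Nova's configuration.

from oslo_config import cfg
from oslo_service import periodic_task

class MiniManager(periodic_task.PeriodicTasks):
    """Toy manager; Nova's ComputeManager declares its tasks the same way."""

    @periodic_task.periodic_task(spacing=60)
    def _poll_rescued_instances(self, context):
        # Each dispatch is preceded by a "Running periodic task ..." DEBUG
        # record from oslo_service.periodic_task.run_periodic_tasks.
        pass

manager = MiniManager(cfg.CONF)
manager.run_periodic_tasks(context=None)  # a service loop calls this on a timer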
[ 2231.529798] env[67977]: DEBUG oslo_concurrency.lockutils [None req-279997e3-4558-47a3-8f4c-19ed0766ea9e tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.158s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2231.530580] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 326.804s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2231.530767] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 157e3bfe-10cc-49c6-aa31-1d935e1a4465] During sync_power_state the instance has a pending task (deleting). Skip. [ 2231.530949] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "157e3bfe-10cc-49c6-aa31-1d935e1a4465" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2245.782712] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2246.775827] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2246.776096] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2247.775889] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2248.772215] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2249.775280] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2249.775599] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2250.776641] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2251.775969] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2251.776181] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2251.776278] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2251.793704] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2251.794098] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2251.794098] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2251.794098] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2251.794282] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2251.794337] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2251.794446] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 396fd258-dc89-4392-a487-921958012e92] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2251.794567] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2251.794686] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2252.789159] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2254.775914] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2254.787199] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2254.787427] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2254.787593] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2254.787745] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2254.788838] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54bee950-5626-4169-8fc4-7bcf095595cf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.797591] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eea13a4c-031a-4b8d-8a6b-06c795e23ca8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.811196] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b05c8038-4196-43aa-a449-0662a129cc55 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.817212] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-282514f0-8710-4db3-aaa3-775d9942b12b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.845161] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180922MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2254.845313] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2254.845507] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2254.912634] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2254.912803] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2254.912960] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2254.913108] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 98f7c8cc-b27f-406c-b34d-22c2c4609e24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2254.913255] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 142d3b29-b467-4007-84ac-8b7e0ee9e326 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2254.913352] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1536ad10-129b-439d-80c5-08fa92aeaed1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2254.913467] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 396fd258-dc89-4392-a487-921958012e92 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2254.913585] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 19f06190-ebc3-4089-81d4-1ac09ec46b46 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2254.913764] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2254.913903] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2255.010924] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cc49f90-aeb0-4bba-85eb-810162c1a83c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.019918] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37ce5fef-1af9-440b-b796-6f79f607250d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.048832] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0590f39e-0134-4e1f-8ae9-230a1a295600 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.055504] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33cab24d-7c93-4542-ade3-da063f627712 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.068460] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2255.076902] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider 
cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2255.090212] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2255.090389] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.245s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2280.378426] env[67977]: WARNING oslo_vmware.rw_handles [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2280.378426] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2280.378426] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2280.378426] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2280.378426] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2280.378426] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 2280.378426] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2280.378426] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2280.378426] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2280.378426] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2280.378426] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2280.378426] env[67977]: ERROR oslo_vmware.rw_handles [ 2280.379079] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/ae9f2ed0-0a2b-4ac4-81ee-f4ca17e59388/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2280.381225] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Caching image {{(pid=67977) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2280.381466] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Copying Virtual Disk [datastore1] vmware_temp/ae9f2ed0-0a2b-4ac4-81ee-f4ca17e59388/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/ae9f2ed0-0a2b-4ac4-81ee-f4ca17e59388/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2280.381740] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7ded1750-5733-4bf2-8ad5-2deaf3551115 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.389453] env[67977]: DEBUG oslo_vmware.api [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for the task: (returnval){ [ 2280.389453] env[67977]: value = "task-3468302" [ 2280.389453] env[67977]: _type = "Task" [ 2280.389453] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2280.397265] env[67977]: DEBUG oslo_vmware.api [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Task: {'id': task-3468302, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2280.900711] env[67977]: DEBUG oslo_vmware.exceptions [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2280.900711] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2280.901325] env[67977]: ERROR nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2280.901325] env[67977]: Faults: ['InvalidArgument'] [ 2280.901325] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Traceback (most recent call last): [ 2280.901325] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2280.901325] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] yield resources [ 2280.901325] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2280.901325] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] self.driver.spawn(context, instance, image_meta, [ 2280.901325] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2280.901325] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2280.901325] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2280.901325] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] self._fetch_image_if_missing(context, vi) [ 2280.901325] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] image_cache(vi, tmp_image_ds_loc) [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] vm_util.copy_virtual_disk( [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] session._wait_for_task(vmdk_copy_task) [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] return self.wait_for_task(task_ref) [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] return evt.wait() [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] result = hub.switch() [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2280.901785] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] return self.greenlet.switch() [ 2280.902245] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2280.902245] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] self.f(*self.args, **self.kw) [ 2280.902245] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2280.902245] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] raise exceptions.translate_fault(task_info.error) [ 2280.902245] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2280.902245] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Faults: ['InvalidArgument'] [ 2280.902245] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] [ 2280.902245] env[67977]: INFO nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Terminating instance [ 2280.903455] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2280.903665] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2280.903913] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-082c2670-9e98-4e7d-9807-09a187c22866 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.906167] env[67977]: DEBUG nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2280.906363] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2280.907091] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7770f34-ec2e-4935-9320-79e2547eeefd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.915157] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2280.915362] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-30f3e336-0298-48d3-9b9c-c372aa761399 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.917418] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2280.917593] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2280.918509] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cf50c808-2860-4fbb-af3d-537255a272d4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.923176] env[67977]: DEBUG oslo_vmware.api [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for the task: (returnval){ [ 2280.923176] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52fdd13c-c5e4-df7a-ab9c-d003bd30aff1" [ 2280.923176] env[67977]: _type = "Task" [ 2280.923176] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2280.936713] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2280.937016] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Creating directory with path [datastore1] vmware_temp/8428ac55-46cf-4169-9d4d-17c9962f8cce/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2280.937175] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4707b3ea-dba4-498a-bbc2-7f78033675be {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.956975] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Created directory with path [datastore1] vmware_temp/8428ac55-46cf-4169-9d4d-17c9962f8cce/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2280.957179] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Fetch image to [datastore1] vmware_temp/8428ac55-46cf-4169-9d4d-17c9962f8cce/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2280.957356] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/8428ac55-46cf-4169-9d4d-17c9962f8cce/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2280.958172] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e29fb5c-ae97-41b2-85b2-8f3dcbefe608 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.964759] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d77d5300-0eb6-4146-9f27-82faac0f6e7d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.973497] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c58fb50c-bcb8-4d19-995d-7c7c703a0230 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2281.006039] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-734c46d2-7389-425c-bf94-9ec0005612df {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2281.008757] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2281.008962] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2281.009160] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Deleting the datastore file [datastore1] 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2281.009430] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9b0f2108-cedf-4a39-ae8a-00bba2872739 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2281.014507] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1dec8c1a-af98-4133-8103-3e9afa8ae9ff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2281.017395] env[67977]: DEBUG oslo_vmware.api [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for the task: (returnval){ [ 2281.017395] env[67977]: value = "task-3468304" [ 2281.017395] env[67977]: _type = "Task" [ 2281.017395] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2281.025094] env[67977]: DEBUG oslo_vmware.api [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Task: {'id': task-3468304, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2281.036072] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2281.090702] env[67977]: DEBUG oslo_vmware.rw_handles [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8428ac55-46cf-4169-9d4d-17c9962f8cce/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2281.166273] env[67977]: DEBUG oslo_vmware.rw_handles [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2281.166515] env[67977]: DEBUG oslo_vmware.rw_handles [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8428ac55-46cf-4169-9d4d-17c9962f8cce/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2281.528513] env[67977]: DEBUG oslo_vmware.api [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Task: {'id': task-3468304, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068838} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2281.528820] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2281.529018] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2281.529248] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2281.529463] env[67977]: INFO nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Took 0.62 seconds to destroy the instance on the hypervisor. [ 2281.531523] env[67977]: DEBUG nova.compute.claims [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2281.531732] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2281.531981] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2281.674935] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c752cc1f-b5fd-457c-b173-4f1e97ad9ff8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2281.682580] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd14b58b-135f-47ee-b770-d55ed8f80cec {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2281.713187] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d262ed6-cb49-4df9-bb38-55c053068b1d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2281.719988] env[67977]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bf54ef4-e20b-4d84-8e11-8c0bd11d9876 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2281.733136] env[67977]: DEBUG nova.compute.provider_tree [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2281.741796] env[67977]: DEBUG nova.scheduler.client.report [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2281.756288] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.224s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2281.756808] env[67977]: ERROR nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2281.756808] env[67977]: Faults: ['InvalidArgument'] [ 2281.756808] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Traceback (most recent call last): [ 2281.756808] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2281.756808] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] self.driver.spawn(context, instance, image_meta, [ 2281.756808] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2281.756808] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2281.756808] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2281.756808] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] self._fetch_image_if_missing(context, vi) [ 2281.756808] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 
2281.756808] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] image_cache(vi, tmp_image_ds_loc) [ 2281.756808] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] vm_util.copy_virtual_disk( [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] session._wait_for_task(vmdk_copy_task) [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] return self.wait_for_task(task_ref) [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] return evt.wait() [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] result = hub.switch() [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] return self.greenlet.switch() [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2281.757219] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] self.f(*self.args, **self.kw) [ 2281.757641] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2281.757641] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] raise exceptions.translate_fault(task_info.error) [ 2281.757641] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2281.757641] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Faults: ['InvalidArgument'] [ 2281.757641] env[67977]: ERROR nova.compute.manager [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] [ 2281.757641] env[67977]: DEBUG nova.compute.utils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] VimFaultException {{(pid=67977) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 2281.758813] env[67977]: DEBUG nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Build of instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 was re-scheduled: A specified parameter was not correct: fileType [ 2281.758813] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2281.759205] env[67977]: DEBUG nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2281.759382] env[67977]: DEBUG nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2281.759550] env[67977]: DEBUG nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2281.759712] env[67977]: DEBUG nova.network.neutron [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2282.123569] env[67977]: DEBUG nova.network.neutron [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2282.135103] env[67977]: INFO nova.compute.manager [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Took 0.38 seconds to deallocate network for instance.
[ 2282.252413] env[67977]: INFO nova.scheduler.client.report [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Deleted allocations for instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 [ 2282.279921] env[67977]: DEBUG oslo_concurrency.lockutils [None req-21e1acbd-893f-4e38-b2b2-f6191fcf5748 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 523.848s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2282.280093] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 377.553s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2282.280204] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] During sync_power_state the instance has a pending task (spawning). Skip. [ 2282.280378] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2282.280595] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 327.110s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2282.280804] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2282.281015] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2282.281189] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" ::
held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2282.283156] env[67977]: INFO nova.compute.manager [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Terminating instance [ 2282.285972] env[67977]: DEBUG nova.compute.manager [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2282.286190] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2282.286453] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a4c0663d-81c0-4165-9588-a0385ac24869 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2282.295787] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e541481f-53ac-4892-93b0-1dc7b350aece {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2282.323478] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9 could not be found. [ 2282.323711] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2282.323893] env[67977]: INFO nova.compute.manager [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2282.324146] env[67977]: DEBUG oslo.service.loopingcall [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2282.324363] env[67977]: DEBUG nova.compute.manager [-] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2282.324459] env[67977]: DEBUG nova.network.neutron [-] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2282.346379] env[67977]: DEBUG nova.network.neutron [-] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2282.354145] env[67977]: INFO nova.compute.manager [-] [instance: 2be86118-6ff9-4cb0-b51b-16fa7d3a72e9] Took 0.03 seconds to deallocate network for instance. [ 2282.436869] env[67977]: DEBUG oslo_concurrency.lockutils [None req-fdc9cbf3-a1b0-4483-9898-5b59812e2e6c tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Lock "2be86118-6ff9-4cb0-b51b-16fa7d3a72e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.156s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2302.751702] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2302.752032] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2302.763158] env[67977]: DEBUG nova.compute.manager [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Starting instance...
{{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2302.810762] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2302.811009] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2302.812461] env[67977]: INFO nova.compute.claims [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2302.949067] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f258a68-f9db-449d-8d3c-b0130eb7a0e1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2302.956376] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-759d7913-6bc1-470e-8d60-a87ce8a83d86 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2302.987124] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a9a4ca3-76e6-4e14-9191-919cf73c3894 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2302.994064] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f84e7f0-96d8-428e-8eb9-93145740e3cb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2303.006935] env[67977]: DEBUG nova.compute.provider_tree [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2303.015539] env[67977]: DEBUG nova.scheduler.client.report [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2303.028873] env[67977]: DEBUG oslo_concurrency.lockutils [None 
req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2303.029379] env[67977]: DEBUG nova.compute.manager [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2303.062768] env[67977]: DEBUG nova.compute.utils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2303.064367] env[67977]: DEBUG nova.compute.manager [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2303.064556] env[67977]: DEBUG nova.network.neutron [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2303.074071] env[67977]: DEBUG nova.compute.manager [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2303.141027] env[67977]: DEBUG nova.policy [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd71a62e4fe3f4a59b7606ef17ea6d0b9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6a2fbea0321a44e7ac6812f9856e8116', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 2303.149313] env[67977]: DEBUG nova.compute.manager [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2303.176328] env[67977]: DEBUG nova.virt.hardware [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=<?>,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:53:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2303.176596] env[67977]: DEBUG nova.virt.hardware [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2303.176756] env[67977]: DEBUG nova.virt.hardware [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2303.176936] env[67977]: DEBUG nova.virt.hardware [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2303.177101] env[67977]: DEBUG nova.virt.hardware [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2303.177270] env[67977]: DEBUG nova.virt.hardware [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2303.177456] env[67977]: DEBUG nova.virt.hardware [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2303.177615] env[67977]: DEBUG nova.virt.hardware [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2303.177784] env[67977]: DEBUG 
nova.virt.hardware [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2303.177966] env[67977]: DEBUG nova.virt.hardware [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2303.178202] env[67977]: DEBUG nova.virt.hardware [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2303.179072] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55226162-c4b5-4a9f-8f9d-308a0fda7a89 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2303.188661] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2726e97-e0f1-48d9-94c3-95fcdf165ae5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2303.475009] env[67977]: DEBUG nova.network.neutron [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Successfully created port: 46721cd0-f9a2-4893-a348-1278fa53aa0a {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2303.995932] env[67977]: DEBUG nova.compute.manager [req-41e3c26a-d21e-46d5-89fa-13207264182a req-3ce36e5e-605a-4dd3-82b2-8376c8001cc6 service nova] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Received event network-vif-plugged-46721cd0-f9a2-4893-a348-1278fa53aa0a {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2303.996232] env[67977]: DEBUG oslo_concurrency.lockutils [req-41e3c26a-d21e-46d5-89fa-13207264182a req-3ce36e5e-605a-4dd3-82b2-8376c8001cc6 service nova] Acquiring lock "da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2303.996416] env[67977]: DEBUG oslo_concurrency.lockutils [req-41e3c26a-d21e-46d5-89fa-13207264182a req-3ce36e5e-605a-4dd3-82b2-8376c8001cc6 service nova] Lock "da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2303.996579] env[67977]: DEBUG oslo_concurrency.lockutils [req-41e3c26a-d21e-46d5-89fa-13207264182a req-3ce36e5e-605a-4dd3-82b2-8376c8001cc6 service nova] Lock "da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2303.996743] env[67977]: DEBUG 
nova.compute.manager [req-41e3c26a-d21e-46d5-89fa-13207264182a req-3ce36e5e-605a-4dd3-82b2-8376c8001cc6 service nova] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] No waiting events found dispatching network-vif-plugged-46721cd0-f9a2-4893-a348-1278fa53aa0a {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2303.996903] env[67977]: WARNING nova.compute.manager [req-41e3c26a-d21e-46d5-89fa-13207264182a req-3ce36e5e-605a-4dd3-82b2-8376c8001cc6 service nova] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Received unexpected event network-vif-plugged-46721cd0-f9a2-4893-a348-1278fa53aa0a for instance with vm_state building and task_state spawning. [ 2304.072649] env[67977]: DEBUG nova.network.neutron [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Successfully updated port: 46721cd0-f9a2-4893-a348-1278fa53aa0a {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2304.083431] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "refresh_cache-da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2304.083582] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquired lock "refresh_cache-da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2304.083731] env[67977]: DEBUG nova.network.neutron [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2304.123373] env[67977]: DEBUG nova.network.neutron [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2304.317918] env[67977]: DEBUG nova.network.neutron [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Updating instance_info_cache with network_info: [{"id": "46721cd0-f9a2-4893-a348-1278fa53aa0a", "address": "fa:16:3e:a0:6c:32", "network": {"id": "8472d1a8-8b3a-40f7-a74a-3449f67e4cb2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1078881203-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6a2fbea0321a44e7ac6812f9856e8116", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6815237d-f565-474d-a3c0-9c675478eb00", "external-id": "nsx-vlan-transportzone-526", "segmentation_id": 526, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap46721cd0-f9", "ovs_interfaceid": "46721cd0-f9a2-4893-a348-1278fa53aa0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2304.330342] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Releasing lock "refresh_cache-da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2304.330612] env[67977]: DEBUG nova.compute.manager [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Instance network_info: |[{"id": "46721cd0-f9a2-4893-a348-1278fa53aa0a", "address": "fa:16:3e:a0:6c:32", "network": {"id": "8472d1a8-8b3a-40f7-a74a-3449f67e4cb2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1078881203-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6a2fbea0321a44e7ac6812f9856e8116", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6815237d-f565-474d-a3c0-9c675478eb00", "external-id": "nsx-vlan-transportzone-526", "segmentation_id": 526, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap46721cd0-f9", "ovs_interfaceid": "46721cd0-f9a2-4893-a348-1278fa53aa0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2304.330978] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a0:6c:32', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6815237d-f565-474d-a3c0-9c675478eb00', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '46721cd0-f9a2-4893-a348-1278fa53aa0a', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2304.338926] env[67977]: DEBUG oslo.service.loopingcall [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2304.339369] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2304.339598] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-698a6498-01cb-45aa-bf3d-c94b9910c168 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.361917] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2304.361917] env[67977]: value = "task-3468305" [ 2304.361917] env[67977]: _type = "Task" [ 2304.361917] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2304.369499] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468305, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2304.872473] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468305, 'name': CreateVM_Task, 'duration_secs': 0.293411} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2304.872695] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2304.873398] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2304.873645] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2304.874103] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2304.874434] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-410252e0-de61-4abc-b67a-050e6e12eb31 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.879492] env[67977]: DEBUG oslo_vmware.api [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Waiting for the task: (returnval){ [ 2304.879492] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]525050ce-55ca-8753-816f-fdda5c4bc442" [ 2304.879492] env[67977]: _type = "Task" [ 2304.879492] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2304.887134] env[67977]: DEBUG oslo_vmware.api [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]525050ce-55ca-8753-816f-fdda5c4bc442, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2305.390186] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2305.390494] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2305.390626] env[67977]: DEBUG oslo_concurrency.lockutils [None req-946f01f9-e2a2-4bf0-a1a0-a5ff0a6bf799 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2306.024056] env[67977]: DEBUG nova.compute.manager [req-3fdfc4d8-8f7c-4806-8c11-8a5e9c70c428 req-ce627400-e6cd-4f05-bfa1-a39b51d5119f service nova] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Received event network-changed-46721cd0-f9a2-4893-a348-1278fa53aa0a {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2306.024265] env[67977]: DEBUG nova.compute.manager [req-3fdfc4d8-8f7c-4806-8c11-8a5e9c70c428 req-ce627400-e6cd-4f05-bfa1-a39b51d5119f service nova] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Refreshing instance network info cache due to event network-changed-46721cd0-f9a2-4893-a348-1278fa53aa0a. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2306.024480] env[67977]: DEBUG oslo_concurrency.lockutils [req-3fdfc4d8-8f7c-4806-8c11-8a5e9c70c428 req-ce627400-e6cd-4f05-bfa1-a39b51d5119f service nova] Acquiring lock "refresh_cache-da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2306.024624] env[67977]: DEBUG oslo_concurrency.lockutils [req-3fdfc4d8-8f7c-4806-8c11-8a5e9c70c428 req-ce627400-e6cd-4f05-bfa1-a39b51d5119f service nova] Acquired lock "refresh_cache-da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2306.024784] env[67977]: DEBUG nova.network.neutron [req-3fdfc4d8-8f7c-4806-8c11-8a5e9c70c428 req-ce627400-e6cd-4f05-bfa1-a39b51d5119f service nova] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Refreshing network info cache for port 46721cd0-f9a2-4893-a348-1278fa53aa0a {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2306.299796] env[67977]: DEBUG nova.network.neutron [req-3fdfc4d8-8f7c-4806-8c11-8a5e9c70c428 req-ce627400-e6cd-4f05-bfa1-a39b51d5119f service nova] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Updated VIF entry in instance network info cache for port 46721cd0-f9a2-4893-a348-1278fa53aa0a. 
{{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2306.300167] env[67977]: DEBUG nova.network.neutron [req-3fdfc4d8-8f7c-4806-8c11-8a5e9c70c428 req-ce627400-e6cd-4f05-bfa1-a39b51d5119f service nova] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Updating instance_info_cache with network_info: [{"id": "46721cd0-f9a2-4893-a348-1278fa53aa0a", "address": "fa:16:3e:a0:6c:32", "network": {"id": "8472d1a8-8b3a-40f7-a74a-3449f67e4cb2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1078881203-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6a2fbea0321a44e7ac6812f9856e8116", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6815237d-f565-474d-a3c0-9c675478eb00", "external-id": "nsx-vlan-transportzone-526", "segmentation_id": 526, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap46721cd0-f9", "ovs_interfaceid": "46721cd0-f9a2-4893-a348-1278fa53aa0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2306.309665] env[67977]: DEBUG oslo_concurrency.lockutils [req-3fdfc4d8-8f7c-4806-8c11-8a5e9c70c428 req-ce627400-e6cd-4f05-bfa1-a39b51d5119f service nova] Releasing lock "refresh_cache-da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2307.090648] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2307.775759] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2307.775759] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2309.776756] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2310.775452] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2311.776397] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] 
Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2311.776397] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2313.775824] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2313.776235] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2313.776235] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2313.795352] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2313.795518] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2313.795631] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2313.795795] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2313.795960] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2313.796224] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 396fd258-dc89-4392-a487-921958012e92] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2313.796375] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Skipping network cache update for instance because it is Building. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2313.796500] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2313.796624] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2314.791198] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2316.775628] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2316.788084] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2316.788323] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.788490] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.788646] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2316.789781] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-978208cd-9089-4839-b3e5-e0de56cfb9b3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.798431] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d69038fc-17ce-4942-8648-ceb916e646fa {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.813292] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26ca4932-295e-44c3-94a5-238488791d3e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.819473] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-dda8d79f-5398-4eea-8f85-027971af0c1e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.847750] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180931MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2316.847895] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2316.848098] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.914751] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.914906] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.915077] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 98f7c8cc-b27f-406c-b34d-22c2c4609e24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.915203] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 142d3b29-b467-4007-84ac-8b7e0ee9e326 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.915320] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1536ad10-129b-439d-80c5-08fa92aeaed1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.915434] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 396fd258-dc89-4392-a487-921958012e92 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.915548] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 19f06190-ebc3-4089-81d4-1ac09ec46b46 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.915660] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.915838] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2316.916011] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2317.009694] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bd82e8b-de3e-40c0-94ae-bec21e2c6f3c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.017370] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5dbe738-84a4-4e8d-8e97-c96bd2a3ed64 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.047091] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01e7f2da-6bc3-46be-b75d-efafbeaa66f7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.053890] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a23288a-5123-409c-8b60-5ca2706c04c5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.066563] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2317.074309] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider 
cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2317.087012] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2317.087214] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.239s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2327.799580] env[67977]: WARNING oslo_vmware.rw_handles [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2327.799580] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2327.799580] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2327.799580] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2327.799580] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2327.799580] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 2327.799580] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2327.799580] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2327.799580] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2327.799580] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2327.799580] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2327.799580] env[67977]: ERROR oslo_vmware.rw_handles [ 2327.800442] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/8428ac55-46cf-4169-9d4d-17c9962f8cce/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2327.802290] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Caching image {{(pid=67977) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2327.802528] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Copying Virtual Disk [datastore1] vmware_temp/8428ac55-46cf-4169-9d4d-17c9962f8cce/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/8428ac55-46cf-4169-9d4d-17c9962f8cce/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2327.802809] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-72554b67-e9a4-48bb-8100-cd8dcbccf967 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.811084] env[67977]: DEBUG oslo_vmware.api [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for the task: (returnval){ [ 2327.811084] env[67977]: value = "task-3468306" [ 2327.811084] env[67977]: _type = "Task" [ 2327.811084] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2327.818893] env[67977]: DEBUG oslo_vmware.api [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Task: {'id': task-3468306, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2328.321313] env[67977]: DEBUG oslo_vmware.exceptions [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2328.321613] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2328.322179] env[67977]: ERROR nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2328.322179] env[67977]: Faults: ['InvalidArgument'] [ 2328.322179] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Traceback (most recent call last): [ 2328.322179] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2328.322179] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] yield resources [ 2328.322179] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2328.322179] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] self.driver.spawn(context, instance, image_meta, [ 2328.322179] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2328.322179] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2328.322179] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2328.322179] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] self._fetch_image_if_missing(context, vi) [ 2328.322179] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] image_cache(vi, tmp_image_ds_loc) [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] vm_util.copy_virtual_disk( [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] session._wait_for_task(vmdk_copy_task) [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] return self.wait_for_task(task_ref) [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] return evt.wait() [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] result = hub.switch() [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2328.322593] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] return self.greenlet.switch() [ 2328.323035] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2328.323035] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] self.f(*self.args, **self.kw) [ 2328.323035] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2328.323035] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] raise exceptions.translate_fault(task_info.error) [ 2328.323035] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2328.323035] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Faults: ['InvalidArgument'] [ 2328.323035] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] [ 2328.323035] env[67977]: INFO nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Terminating instance [ 2328.324025] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2328.324256] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2328.324493] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3093246a-17ef-4468-b98b-935aa3530f93 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2328.326899] env[67977]: DEBUG nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2328.327121] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2328.327846] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49c155c5-fbb3-42ba-a590-a581fd208f73 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2328.334353] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2328.334558] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-71c35ae0-11f7-40a3-b387-af5ffc69a729 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2328.336667] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2328.336838] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2328.337775] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-054b07de-7135-4169-bae3-76a5c0b5912b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2328.342319] env[67977]: DEBUG oslo_vmware.api [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for the task: (returnval){ [ 2328.342319] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d3d243-3563-1bd8-6166-73f314137211" [ 2328.342319] env[67977]: _type = "Task" [ 2328.342319] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2328.348937] env[67977]: DEBUG oslo_vmware.api [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d3d243-3563-1bd8-6166-73f314137211, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2328.403738] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2328.403981] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2328.404189] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Deleting the datastore file [datastore1] d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2328.404462] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-20e91a76-a411-45ff-8fde-b67d03a80fad {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2328.410735] env[67977]: DEBUG oslo_vmware.api [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for the task: (returnval){ [ 2328.410735] env[67977]: value = "task-3468308" [ 2328.410735] env[67977]: _type = "Task" [ 2328.410735] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2328.418114] env[67977]: DEBUG oslo_vmware.api [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Task: {'id': task-3468308, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2328.853304] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2328.853624] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Creating directory with path [datastore1] vmware_temp/abaa4824-3656-4146-bf0b-a47ff284dc0c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2328.853849] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b3c1efac-3c90-4d7f-af41-320d1c3eb18d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2328.864667] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Created directory with path [datastore1] vmware_temp/abaa4824-3656-4146-bf0b-a47ff284dc0c/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2328.864845] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Fetch image to [datastore1] vmware_temp/abaa4824-3656-4146-bf0b-a47ff284dc0c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2328.865048] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/abaa4824-3656-4146-bf0b-a47ff284dc0c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2328.865762] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-046771c1-0c05-4b89-8c7f-a634a0a62130 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2328.872367] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adec2e2a-6df9-4957-9e07-f638edccc834 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2328.881331] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ab09f11-ba31-44f3-91b6-134cc91fceef {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2328.914923] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5900e22e-458a-40aa-a739-2577c72b52a0 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2328.921660] env[67977]: DEBUG oslo_vmware.api [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Task: {'id': task-3468308, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079018} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2328.923075] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2328.923274] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2328.923443] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2328.923617] env[67977]: INFO nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Took 0.60 seconds to destroy the instance on the hypervisor. 
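Both tracebacks around this point end in the same frame: oslo_vmware's _poll_task re-reads the vCenter task on a timer and, when the task reports an error, raises the fault translated by exceptions.translate_fault back through wait_for_task into the driver's spawn path. A minimal Python sketch of that polling pattern follows; the read_task_info callable and the dict shape of the task info are illustrative stand-ins, not oslo.vmware's actual interfaces.

import time

class VimFault(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, msg, fault_list):
        super().__init__(msg)
        self.fault_list = fault_list        # e.g. ['InvalidArgument']

def wait_for_task(read_task_info, poll_interval=0.5):
    """Poll a vSphere Task until it finishes, mirroring the loop in the traceback."""
    while True:
        info = read_task_info()             # one PropertyCollector read of the Task
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            # The branch behind "A specified parameter was not correct:
            # fileType / Faults: ['InvalidArgument']" above.
            raise VimFault(info['error']['localizedMessage'],
                           info['error']['faults'])
        time.sleep(poll_interval)           # the "progress is 0%." lines come from each poll

In this run the failing task is the virtual-disk copy started from vm_util.copy_virtual_disk, so the fault surfaces while populating the image cache; the compute manager then destroys the half-built VM (the entries above) and hands the build back to the scheduler (the re-schedule entries below).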
[ 2328.925605] env[67977]: DEBUG nova.compute.claims [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2328.925771] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2328.925982] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2328.928489] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-198b4971-9702-4ce3-abaa-606ba62f983d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2328.949398] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2329.000988] env[67977]: DEBUG oslo_vmware.rw_handles [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/abaa4824-3656-4146-bf0b-a47ff284dc0c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2329.060868] env[67977]: DEBUG oslo_vmware.rw_handles [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2329.061064] env[67977]: DEBUG oslo_vmware.rw_handles [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/abaa4824-3656-4146-bf0b-a47ff284dc0c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2329.119723] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e700eb79-121d-4a0e-8438-b5d643f99e3b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2329.127501] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a984134-ae29-4e81-807a-20abc01a4396 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2329.158319] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f408495-22a4-4729-9dc2-429043f00869 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2329.165175] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6b3e3a1-4963-4a36-bbfd-5734b196a178 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2329.177697] env[67977]: DEBUG nova.compute.provider_tree [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2329.185600] env[67977]: DEBUG nova.scheduler.client.report [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2329.199455] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.273s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2329.199961] env[67977]: ERROR nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2329.199961] env[67977]: Faults: ['InvalidArgument']
[ 2329.199961] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Traceback (most recent call last):
[ 2329.199961] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2329.199961] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] self.driver.spawn(context, instance, image_meta,
[ 2329.199961] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2329.199961] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2329.199961] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2329.199961] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] self._fetch_image_if_missing(context, vi)
[ 2329.199961] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2329.199961] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] image_cache(vi, tmp_image_ds_loc)
[ 2329.199961] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] vm_util.copy_virtual_disk(
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] session._wait_for_task(vmdk_copy_task)
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] return self.wait_for_task(task_ref)
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] return evt.wait()
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] result = hub.switch()
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] return self.greenlet.switch()
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2329.200448] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] self.f(*self.args, **self.kw)
[ 2329.200881] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2329.200881] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] raise exceptions.translate_fault(task_info.error)
[ 2329.200881] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2329.200881] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Faults: ['InvalidArgument']
[ 2329.200881] env[67977]: ERROR nova.compute.manager [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8]
[ 2329.200881] env[67977]: DEBUG nova.compute.utils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2329.201935] env[67977]: DEBUG nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Build of instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 was re-scheduled: A specified parameter was not correct: fileType
[ 2329.201935] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2329.202331] env[67977]: DEBUG nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2329.202508] env[67977]: DEBUG nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2329.202676] env[67977]: DEBUG nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2329.202839] env[67977]: DEBUG nova.network.neutron [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2329.491042] env[67977]: DEBUG nova.network.neutron [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2329.528651] env[67977]: INFO nova.compute.manager [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Took 0.33 seconds to deallocate network for instance. [ 2329.646314] env[67977]: INFO nova.scheduler.client.report [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Deleted allocations for instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 [ 2329.667096] env[67977]: DEBUG oslo_concurrency.lockutils [None req-ff853cd4-399c-4da9-90cd-62b25506fed2 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 555.205s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2329.667598] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 424.940s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2329.667793] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] During sync_power_state the instance has a pending task (spawning). Skip. 
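The lockutils entries in this stretch carry useful timing data: the build lock on d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 was held for 555.205s, and the power-state sync waited 424.940s to acquire it. Figures like these can be pulled straight out of the log; the patterns below are derived from the 'acquired ... waited' and '"released" ... held' lines above (a small assumed helper for triage, not part of Nova).

import re

# Formats copied from the oslo_concurrency.lockutils lines in this log.
ACQUIRED = re.compile(r'Lock "([^"]+)" acquired by "([^"]+)" :: waited (\d+\.\d+)s')
RELEASED = re.compile(r'Lock "([^"]+)" "released" by "([^"]+)" :: held (\d+\.\d+)s')

def lock_events(lines):
    """Yield (kind, lock_name, owner, seconds) for each acquire/release line."""
    for line in lines:
        m = ACQUIRED.search(line)
        if m:
            yield ('waited', m.group(1), m.group(2), float(m.group(3)))
        m = RELEASED.search(line)
        if m:
            yield ('held', m.group(1), m.group(2), float(m.group(3)))

# Sorting by the duration column surfaces the 555.205s build-lock hold first:
#   sorted(lock_events(open('nova-compute.log')), key=lambda e: -e[3])[:5]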
[ 2329.667967] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2329.668513] env[67977]: DEBUG oslo_concurrency.lockutils [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 358.881s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2329.668686] env[67977]: DEBUG oslo_concurrency.lockutils [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Acquiring lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2329.668889] env[67977]: DEBUG oslo_concurrency.lockutils [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2329.669142] env[67977]: DEBUG oslo_concurrency.lockutils [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2329.670940] env[67977]: INFO nova.compute.manager [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Terminating instance [ 2329.672635] env[67977]: DEBUG nova.compute.manager [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Start destroying the instance on the hypervisor. 
{{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2329.672821] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2329.673106] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-526dbc4c-8df2-49a4-bc47-953ee98475c4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2329.682880] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acf62dc6-b9e6-492b-9c26-22f39601adc4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2329.713295] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 could not be found. [ 2329.713525] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2329.713690] env[67977]: INFO nova.compute.manager [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2329.714067] env[67977]: DEBUG oslo.service.loopingcall [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2329.714173] env[67977]: DEBUG nova.compute.manager [-] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2329.714262] env[67977]: DEBUG nova.network.neutron [-] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2329.739040] env[67977]: DEBUG nova.network.neutron [-] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2329.748981] env[67977]: INFO nova.compute.manager [-] [instance: d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8] Took 0.03 seconds to deallocate network for instance. 
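The second terminate of d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8 above finds nothing on the backend, because the first destroy already unregistered the VM and deleted its datastore directory; vmops downgrades the InstanceNotFound to a warning so the delete still completes and network deallocation proceeds. A sketch of that idempotent-destroy guard, with placeholder helper names (only the exception handling mirrors the log):

import logging

LOG = logging.getLogger(__name__)

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

def destroy_instance(instance_uuid, unregister_and_delete):
    """Destroy a VM, treating 'already gone' as success (vmops destroy path)."""
    try:
        unregister_and_delete(instance_uuid)   # UnregisterVM + datastore cleanup
    except InstanceNotFound:
        # Matches the WARNING above: a previous cleanup already removed the
        # VM, so terminate carries on instead of failing the user's delete.
        LOG.warning("Instance does not exist on backend: %s", instance_uuid)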
[ 2329.831343] env[67977]: DEBUG oslo_concurrency.lockutils [None req-a38f9bc2-e2a9-4e9a-9828-af247cfbceb8 tempest-AttachVolumeNegativeTest-1518646279 tempest-AttachVolumeNegativeTest-1518646279-project-member] Lock "d8d5bc99-47e4-44a3-b2e0-aaf90c2393d8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.163s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2345.862028] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "d4339fd1-c694-4349-bc12-4d915fa23079" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2345.862028] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "d4339fd1-c694-4349-bc12-4d915fa23079" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2345.873670] env[67977]: DEBUG nova.compute.manager [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2345.923504] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2345.923762] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2345.925238] env[67977]: INFO nova.compute.claims [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2346.064806] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d56f1ab-4dcf-4492-a35e-1dd8d69e69c8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2346.072272] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a1ca630-2058-4718-86f9-4c4b59a77267 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2346.100549] env[67977]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-446eb68e-a531-47a0-bae5-888f07e40f18 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2346.107514] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19452782-3e4d-4d88-b002-e9fbaa81c227 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2346.121185] env[67977]: DEBUG nova.compute.provider_tree [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2346.129978] env[67977]: DEBUG nova.scheduler.client.report [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2346.147131] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.223s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2346.147642] env[67977]: DEBUG nova.compute.manager [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2346.181427] env[67977]: DEBUG nova.compute.utils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2346.182977] env[67977]: DEBUG nova.compute.manager [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Allocating IP information in the background. 
{{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2346.183556] env[67977]: DEBUG nova.network.neutron [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2346.192484] env[67977]: DEBUG nova.compute.manager [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2346.247713] env[67977]: DEBUG nova.policy [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '78df84566c65469890b3b6f15f3e5e01', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ff581ae563e45108f497cade6990d79', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 2346.253114] env[67977]: DEBUG nova.compute.manager [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2346.277976] env[67977]: DEBUG nova.virt.hardware [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2346.278136] env[67977]: DEBUG nova.virt.hardware [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2346.279021] env[67977]: DEBUG nova.virt.hardware [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2346.279021] env[67977]: DEBUG nova.virt.hardware [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2346.279021] env[67977]: DEBUG nova.virt.hardware [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2346.279021] env[67977]: DEBUG nova.virt.hardware [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2346.279021] env[67977]: DEBUG nova.virt.hardware [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2346.279276] env[67977]: DEBUG nova.virt.hardware [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2346.279276] env[67977]: DEBUG nova.virt.hardware [None 
req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2346.279411] env[67977]: DEBUG nova.virt.hardware [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2346.279580] env[67977]: DEBUG nova.virt.hardware [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2346.280433] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02db09d5-dcd1-49e4-904a-d81bdf905aff {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2346.288120] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b147b59-e001-4068-9497-a9cef508e482 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2346.530099] env[67977]: DEBUG nova.network.neutron [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Successfully created port: 73567979-be02-4ffd-b4fd-24c39cba4219 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2347.155743] env[67977]: DEBUG nova.compute.manager [req-aafc3452-c4b2-4c98-9371-1f88144dfda3 req-3121a29e-d495-452b-9e30-daead52a3d12 service nova] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Received event network-vif-plugged-73567979-be02-4ffd-b4fd-24c39cba4219 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2347.156053] env[67977]: DEBUG oslo_concurrency.lockutils [req-aafc3452-c4b2-4c98-9371-1f88144dfda3 req-3121a29e-d495-452b-9e30-daead52a3d12 service nova] Acquiring lock "d4339fd1-c694-4349-bc12-4d915fa23079-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2347.156180] env[67977]: DEBUG oslo_concurrency.lockutils [req-aafc3452-c4b2-4c98-9371-1f88144dfda3 req-3121a29e-d495-452b-9e30-daead52a3d12 service nova] Lock "d4339fd1-c694-4349-bc12-4d915fa23079-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2347.156375] env[67977]: DEBUG oslo_concurrency.lockutils [req-aafc3452-c4b2-4c98-9371-1f88144dfda3 req-3121a29e-d495-452b-9e30-daead52a3d12 service nova] Lock "d4339fd1-c694-4349-bc12-4d915fa23079-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2347.156540] env[67977]: DEBUG nova.compute.manager 
[req-aafc3452-c4b2-4c98-9371-1f88144dfda3 req-3121a29e-d495-452b-9e30-daead52a3d12 service nova] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] No waiting events found dispatching network-vif-plugged-73567979-be02-4ffd-b4fd-24c39cba4219 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2347.156699] env[67977]: WARNING nova.compute.manager [req-aafc3452-c4b2-4c98-9371-1f88144dfda3 req-3121a29e-d495-452b-9e30-daead52a3d12 service nova] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Received unexpected event network-vif-plugged-73567979-be02-4ffd-b4fd-24c39cba4219 for instance with vm_state building and task_state spawning. [ 2347.241083] env[67977]: DEBUG nova.network.neutron [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Successfully updated port: 73567979-be02-4ffd-b4fd-24c39cba4219 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2347.257130] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "refresh_cache-d4339fd1-c694-4349-bc12-4d915fa23079" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2347.257298] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "refresh_cache-d4339fd1-c694-4349-bc12-4d915fa23079" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2347.257460] env[67977]: DEBUG nova.network.neutron [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2347.294217] env[67977]: DEBUG nova.network.neutron [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Instance cache missing network info. 
{{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2347.445283] env[67977]: DEBUG nova.network.neutron [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Updating instance_info_cache with network_info: [{"id": "73567979-be02-4ffd-b4fd-24c39cba4219", "address": "fa:16:3e:2a:41:34", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73567979-be", "ovs_interfaceid": "73567979-be02-4ffd-b4fd-24c39cba4219", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2347.455368] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "refresh_cache-d4339fd1-c694-4349-bc12-4d915fa23079" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2347.455649] env[67977]: DEBUG nova.compute.manager [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Instance network_info: |[{"id": "73567979-be02-4ffd-b4fd-24c39cba4219", "address": "fa:16:3e:2a:41:34", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73567979-be", "ovs_interfaceid": "73567979-be02-4ffd-b4fd-24c39cba4219", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2347.456042] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2a:41:34', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5efce30e-48dd-493a-a354-f562a8adf7af', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '73567979-be02-4ffd-b4fd-24c39cba4219', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2347.464075] env[67977]: DEBUG oslo.service.loopingcall [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2347.464075] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2347.464262] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0b4c9b12-0a11-4e94-a9fd-86f971699ea6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.484532] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2347.484532] env[67977]: value = "task-3468309" [ 2347.484532] env[67977]: _type = "Task" [ 2347.484532] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2347.492067] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468309, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2347.994890] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468309, 'name': CreateVM_Task, 'duration_secs': 0.307818} completed successfully. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2347.995073] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2347.995873] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2347.996060] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2347.996431] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2347.996680] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ec7e47e3-8669-4852-8ac8-609c4b4a4890 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.001263] env[67977]: DEBUG oslo_vmware.api [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 2348.001263] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52747ac5-2819-bd64-551e-998b791fdc4f" [ 2348.001263] env[67977]: _type = "Task" [ 2348.001263] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2348.008493] env[67977]: DEBUG oslo_vmware.api [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52747ac5-2819-bd64-551e-998b791fdc4f, 'name': SearchDatastore_Task} progress is 0%. 
[ 2348.292145] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b9fbe154-99b9-4687-b07e-a5134d7a9839 tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquiring lock "396fd258-dc89-4392-a487-921958012e92" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2348.510947] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2348.511215] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 2348.511427] env[67977]: DEBUG oslo_concurrency.lockutils [None req-8d44de52-4bd9-40ce-bc27-f7afad000eae tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2349.187682] env[67977]: DEBUG nova.compute.manager [req-b3dd60bb-78fe-411e-9355-0e04cd6c5d17 req-bf242962-3634-4697-afa4-dfdb6134239b service nova] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Received event network-changed-73567979-be02-4ffd-b4fd-24c39cba4219 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2349.187897] env[67977]: DEBUG nova.compute.manager [req-b3dd60bb-78fe-411e-9355-0e04cd6c5d17 req-bf242962-3634-4697-afa4-dfdb6134239b service nova] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Refreshing instance network info cache due to event network-changed-73567979-be02-4ffd-b4fd-24c39cba4219. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 2349.188125] env[67977]: DEBUG oslo_concurrency.lockutils [req-b3dd60bb-78fe-411e-9355-0e04cd6c5d17 req-bf242962-3634-4697-afa4-dfdb6134239b service nova] Acquiring lock "refresh_cache-d4339fd1-c694-4349-bc12-4d915fa23079" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2349.188277] env[67977]: DEBUG oslo_concurrency.lockutils [req-b3dd60bb-78fe-411e-9355-0e04cd6c5d17 req-bf242962-3634-4697-afa4-dfdb6134239b service nova] Acquired lock "refresh_cache-d4339fd1-c694-4349-bc12-4d915fa23079" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2349.188441] env[67977]: DEBUG nova.network.neutron [req-b3dd60bb-78fe-411e-9355-0e04cd6c5d17 req-bf242962-3634-4697-afa4-dfdb6134239b service nova] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Refreshing network info cache for port 73567979-be02-4ffd-b4fd-24c39cba4219 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 2349.457135] env[67977]: DEBUG nova.network.neutron [req-b3dd60bb-78fe-411e-9355-0e04cd6c5d17 req-bf242962-3634-4697-afa4-dfdb6134239b service nova] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Updated VIF entry in instance network info cache for port 73567979-be02-4ffd-b4fd-24c39cba4219. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 2349.457765] env[67977]: DEBUG nova.network.neutron [req-b3dd60bb-78fe-411e-9355-0e04cd6c5d17 req-bf242962-3634-4697-afa4-dfdb6134239b service nova] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Updating instance_info_cache with network_info: [{"id": "73567979-be02-4ffd-b4fd-24c39cba4219", "address": "fa:16:3e:2a:41:34", "network": {"id": "3cc90710-6851-4be6-9664-050d83601d76", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1624138118-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff581ae563e45108f497cade6990d79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5efce30e-48dd-493a-a354-f562a8adf7af", "external-id": "nsx-vlan-transportzone-283", "segmentation_id": 283, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73567979-be", "ovs_interfaceid": "73567979-be02-4ffd-b4fd-24c39cba4219", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2349.466769] env[67977]: DEBUG oslo_concurrency.lockutils [req-b3dd60bb-78fe-411e-9355-0e04cd6c5d17 req-bf242962-3634-4697-afa4-dfdb6134239b service nova] Releasing lock "refresh_cache-d4339fd1-c694-4349-bc12-4d915fa23079" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
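The instance_info_cache entry recorded above is plain JSON, so its interesting fields (port id, MAC, fixed IPs, NSX logical-switch id) can be pulled out directly. A small illustrative helper, not part of nova:

    import json

    def summarize_vifs(network_info_json):
        """Return (port id, MAC, fixed IPs, NSX switch id) for each VIF."""
        out = []
        for vif in json.loads(network_info_json):
            ips = [ip['address']
                   for subnet in vif['network']['subnets']
                   for ip in subnet['ips']]
            out.append({'port_id': vif['id'],
                        'mac': vif['address'],
                        'ips': ips,
                        'nsx_switch': vif['details'].get('nsx-logical-switch-id')})
        return out

For the entry above this yields port 73567979-be02-4ffd-b4fd-24c39cba4219, MAC fa:16:3e:2a:41:34, fixed IP 192.168.128.11 and logical switch 5efce30e-48dd-493a-a354-f562a8adf7af, matching the VIF info logged at 2347.456042.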
[ 2368.087802] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2368.776228] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2368.776513] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2371.772953] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2371.794655] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2371.794655] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2373.775583] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2373.775855] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2375.777767] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2375.777767] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2375.777767] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2375.797041] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2375.797041] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2375.797041] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2375.797041] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2375.797041] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 396fd258-dc89-4392-a487-921958012e92] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2375.797291] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2375.797291] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2375.797291] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2375.797291] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2376.791886] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2377.775620] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2377.787312] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2377.787574] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2377.787778] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2377.787942] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 2377.790042] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14baf42d-6ed1-423d-b13e-5a58df2864ee {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2377.797769] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ade92c8e-c735-4b25-ae93-64f8773ecc18 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2377.813268] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a50f68b-7bc5-485b-9f91-ad6a15288162 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2377.819456] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbd6ed07-4414-4106-8f9d-570966c046f8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2377.848272] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180938MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 2377.848435] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2377.848612] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2377.913989] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2377.914166] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 98f7c8cc-b27f-406c-b34d-22c2c4609e24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2377.914298] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 142d3b29-b467-4007-84ac-8b7e0ee9e326 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2377.914491] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1536ad10-129b-439d-80c5-08fa92aeaed1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2377.914625] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 396fd258-dc89-4392-a487-921958012e92 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2377.914745] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 19f06190-ebc3-4089-81d4-1ac09ec46b46 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2377.914863] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2377.914979] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d4339fd1-c694-4349-bc12-4d915fa23079 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2377.915170] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2377.915336] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2378.009538] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c6e9945-e200-471f-91f1-2fc2a6fe06cd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.017010] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caa435d8-092f-49e4-97db-40abe10ed283 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.046999] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1479a1f7-de38-4a8d-a6f4-8d361dbc9ec6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.053887] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b35a3553-f0dd-4fd1-ad77-b34e0fbfc7cf {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.066950] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2378.074937] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2378.088058] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2378.088200] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.239s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
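The inventory reported at 2378.074937 is what placement uses for capacity: usable = (total - reserved) * allocation_ratio per resource class. A quick check of those figures (a sketch only; the real math lives in the placement service):

    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    def capacity(inv):
        """Usable capacity per resource class, placement-style."""
        return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
                for rc, v in inv.items()}

    print(capacity(inventory))
    # {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}

With 8 of 48 physical vCPUs allocated and a 4.0 allocation ratio, this node is nowhere near its schedulable VCPU limit of 192.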
[ 2378.118748] env[67977]: DEBUG oslo_concurrency.lockutils [None req-2bf88ef4-1581-4922-b2c6-cd510b737d38 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "19f06190-ebc3-4089-81d4-1ac09ec46b46" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2378.350442] env[67977]: WARNING oslo_vmware.rw_handles [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2378.350442] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2378.350442] env[67977]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2378.350442] env[67977]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 2378.350442] env[67977]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2378.350442] env[67977]: ERROR oslo_vmware.rw_handles     response.begin()
[ 2378.350442] env[67977]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2378.350442] env[67977]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 2378.350442] env[67977]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2378.350442] env[67977]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 2378.350442] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2378.350442] env[67977]: ERROR oslo_vmware.rw_handles
[ 2378.350874] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/abaa4824-3656-4146-bf0b-a47ff284dc0c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2378.352981] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2378.353266] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Copying Virtual Disk [datastore1] vmware_temp/abaa4824-3656-4146-bf0b-a47ff284dc0c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/abaa4824-3656-4146-bf0b-a47ff284dc0c/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2378.353572] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-70007c9c-073a-4728-99be-5e83f42f9802 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.362205] env[67977]: DEBUG oslo_vmware.api [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for the task: (returnval){
[ 2378.362205] env[67977]: value = "task-3468310"
[ 2378.362205] env[67977]: _type = "Task"
[ 2378.362205] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2378.370981] env[67977]: DEBUG oslo_vmware.api [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Task: {'id': task-3468310, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2378.872958] env[67977]: DEBUG oslo_vmware.exceptions [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2378.872958] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2378.873380] env[67977]: ERROR nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2378.873380] env[67977]: Faults: ['InvalidArgument']
[ 2378.873380] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Traceback (most recent call last):
[ 2378.873380] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2378.873380] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     yield resources
[ 2378.873380] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2378.873380] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     self.driver.spawn(context, instance, image_meta,
[ 2378.873380] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2378.873380] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2378.873380] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2378.873380] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     self._fetch_image_if_missing(context, vi)
[ 2378.873380] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     image_cache(vi, tmp_image_ds_loc)
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     vm_util.copy_virtual_disk(
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     session._wait_for_task(vmdk_copy_task)
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     return self.wait_for_task(task_ref)
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     return evt.wait()
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     result = hub.switch()
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2378.873692] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     return self.greenlet.switch()
[ 2378.874020] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2378.874020] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     self.f(*self.args, **self.kw)
[ 2378.874020] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2378.874020] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]     raise exceptions.translate_fault(task_info.error)
[ 2378.874020] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2378.874020] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Faults: ['InvalidArgument']
[ 2378.874020] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9]
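The "Fault InvalidArgument not matched" line at 2378.872958 shows oslo.vmware looking up a typed exception class for the fault name, failing to find one, and falling back to the generic VimFaultException that the traceback above then raises. A sketch of that dispatch; the registry contents here are illustrative, not oslo.vmware's actual table:

    class VimFaultException(Exception):
        def __init__(self, fault_list, msg):
            super().__init__(msg)
            self.fault_list = fault_list

    class FileFaultException(VimFaultException):
        """Example of a typed fault class; the name is illustrative."""

    _FAULT_CLASSES = {'FileFault': FileFaultException}  # hypothetical registry

    def translate_fault(fault_name, msg):
        cls = _FAULT_CLASSES.get(fault_name)
        if cls is None:
            # "Fault InvalidArgument not matched": fall back to the generic class
            return VimFaultException([fault_name], msg)
        return cls([fault_name], msg)

    err = translate_fault('InvalidArgument',
                          'A specified parameter was not correct: fileType')
    assert type(err) is VimFaultException and err.fault_list == ['InvalidArgument']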
[ 2378.874020] env[67977]: INFO nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Terminating instance
[ 2378.875258] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2378.875509] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2378.875754] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f1d52570-e229-4a0e-99e4-d8187bc90c8f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.877926] env[67977]: DEBUG nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2378.878138] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2378.878881] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aca3cb41-252b-46da-a825-e912ba900cc7 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.885830] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2378.886048] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-00d6a85c-ca45-418b-b4e0-3e90bbd18f5e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.888121] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2378.888301] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2378.889232] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6df6af87-3c83-4772-97c0-0bad87a74e3d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.894057] env[67977]: DEBUG oslo_vmware.api [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Waiting for the task: (returnval){
[ 2378.894057] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52bed573-d837-b34b-19cb-1be20f905b63"
[ 2378.894057] env[67977]: _type = "Task"
[ 2378.894057] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2378.908249] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2378.908505] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Creating directory with path [datastore1] vmware_temp/664d5616-2b5a-48cc-8deb-03fb43285c16/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2378.908716] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fb073a90-bc03-4c90-b1ac-fa321efb7b28 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.919338] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Created directory with path [datastore1] vmware_temp/664d5616-2b5a-48cc-8deb-03fb43285c16/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2378.919534] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Fetch image to [datastore1] vmware_temp/664d5616-2b5a-48cc-8deb-03fb43285c16/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2378.919701] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/664d5616-2b5a-48cc-8deb-03fb43285c16/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 2378.920438] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-886ab58d-4b0d-4d34-960b-ecb4c3c84756 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.926855] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04192d37-4484-4aac-a6f7-31f4c16da896 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.935612] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d80cd119-8219-46e1-bbe6-d50d9b2b928e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.969103] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07f808ce-d839-4e34-b94e-5584932bd017 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.971760] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2378.971952] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2378.972139] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Deleting the datastore file [datastore1] ad0b21ff-90be-4a78-8cc7-b347df8579a9 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2378.972392] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b303e0f2-8e3d-42d6-95d3-27b7c409100d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.977063] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a09c7353-502b-4a20-813c-f23a0d0b2f6b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2378.979650] env[67977]: DEBUG oslo_vmware.api [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for the task: (returnval){
[ 2378.979650] env[67977]: value = "task-3468312"
[ 2378.979650] env[67977]: _type = "Task"
[ 2378.979650] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2378.986941] env[67977]: DEBUG oslo_vmware.api [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Task: {'id': task-3468312, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2379.005477] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2379.056685] env[67977]: DEBUG oslo_vmware.rw_handles [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/664d5616-2b5a-48cc-8deb-03fb43285c16/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2379.115574] env[67977]: DEBUG oslo_vmware.rw_handles [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2379.115738] env[67977]: DEBUG oslo_vmware.rw_handles [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/664d5616-2b5a-48cc-8deb-03fb43285c16/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2379.490301] env[67977]: DEBUG oslo_vmware.api [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Task: {'id': task-3468312, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069321} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2379.490836] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 2379.490836] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 2379.491117] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2379.491359] env[67977]: INFO nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Took 0.61 seconds to destroy the instance on the hypervisor.
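The write handle above targets the ESX host's /folder HTTP endpoint, with the datastore-relative path in the URL path and the datacenter and datastore names as query parameters. A sketch of how such a URL is assembled from the values in the log; the helper itself is hypothetical, not oslo.vmware's API:

    from urllib.parse import quote, urlencode

    def datastore_upload_url(host, ds_path, dc_path='ha-datacenter',
                             ds_name='datastore1', port=443):
        """Build an ESX /folder upload URL from a datastore-relative path."""
        query = urlencode({'dcPath': dc_path, 'dsName': ds_name})
        return f'https://{host}:{port}/folder/{quote(ds_path)}?{query}'

    print(datastore_upload_url(
        'esx7c1n2.openstack.eu-de-1.cloud.sap',
        'vmware_temp/664d5616-2b5a-48cc-8deb-03fb43285c16/'
        '5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk'))

This reproduces the URL form logged at 2379.056685; the earlier RemoteDisconnected warning at 2378.350442 is the same endpoint closing the connection while the response to such an upload was being read.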
[ 2379.493506] env[67977]: DEBUG nova.compute.claims [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2379.493766] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2379.494046] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2379.633336] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03aa911a-db86-405c-bed2-e2c07bad1e14 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.640514] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39046895-30a7-41cd-8d99-d001fb7c27ab {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.669975] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6147bec0-bd11-4109-9cf4-e43527b8d60c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.676500] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3392ea54-b0e4-49be-9f83-b2fe9a6c7010 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.689714] env[67977]: DEBUG nova.compute.provider_tree [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2379.697837] env[67977]: DEBUG nova.scheduler.client.report [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2379.710662] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.217s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2379.711251] env[67977]: ERROR nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2379.711251] env[67977]: Faults: ['InvalidArgument'] [ 2379.711251] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Traceback (most recent call last): [ 2379.711251] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2379.711251] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] self.driver.spawn(context, instance, image_meta, [ 2379.711251] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2379.711251] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2379.711251] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2379.711251] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] self._fetch_image_if_missing(context, vi) [ 2379.711251] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2379.711251] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] image_cache(vi, tmp_image_ds_loc) [ 2379.711251] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] vm_util.copy_virtual_disk( [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] session._wait_for_task(vmdk_copy_task) [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] return self.wait_for_task(task_ref) [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] return evt.wait() [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] result = hub.switch() [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] return self.greenlet.switch() [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2379.711544] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] self.f(*self.args, **self.kw) [ 2379.711827] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2379.711827] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] raise exceptions.translate_fault(task_info.error) [ 2379.711827] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2379.711827] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Faults: ['InvalidArgument'] [ 2379.711827] env[67977]: ERROR nova.compute.manager [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] [ 2379.712221] env[67977]: DEBUG nova.compute.utils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2379.713792] env[67977]: DEBUG nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Build of instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 was re-scheduled: A specified parameter was not correct: fileType [ 2379.713792] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2379.714184] env[67977]: DEBUG nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2379.714354] env[67977]: DEBUG nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
[ 2379.714522] env[67977]: DEBUG nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2379.714682] env[67977]: DEBUG nova.network.neutron [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2379.982103] env[67977]: DEBUG nova.network.neutron [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2379.993133] env[67977]: INFO nova.compute.manager [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Took 0.28 seconds to deallocate network for instance.
[ 2380.092435] env[67977]: INFO nova.scheduler.client.report [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Deleted allocations for instance ad0b21ff-90be-4a78-8cc7-b347df8579a9
[ 2380.116311] env[67977]: DEBUG oslo_concurrency.lockutils [None req-449e60d1-eb33-4675-ac08-5edf3212e56c tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 592.053s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2380.116428] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 475.389s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2380.116534] env[67977]: INFO nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] During sync_power_state the instance has a pending task (spawning). Skip.
[ 2380.116739] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2380.117212] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 395.833s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2380.118025] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Acquiring lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2380.118025] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2380.118025] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2380.119837] env[67977]: INFO nova.compute.manager [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Terminating instance
[ 2380.121500] env[67977]: DEBUG nova.compute.manager [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2380.121696] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2380.122170] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ca55ec70-1dba-4d1c-99a1-0759cc6077e4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2380.131081] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-803493f3-0789-4c6b-8b14-48af9acec8f0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2380.159163] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ad0b21ff-90be-4a78-8cc7-b347df8579a9 could not be found.
[ 2380.159361] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2380.159540] env[67977]: INFO nova.compute.manager [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2380.159773] env[67977]: DEBUG oslo.service.loopingcall [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2380.159982] env[67977]: DEBUG nova.compute.manager [-] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2380.160115] env[67977]: DEBUG nova.network.neutron [-] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2380.183623] env[67977]: DEBUG nova.network.neutron [-] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2380.192081] env[67977]: INFO nova.compute.manager [-] [instance: ad0b21ff-90be-4a78-8cc7-b347df8579a9] Took 0.03 seconds to deallocate network for instance.
[ 2380.281067] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b848bcf0-300f-47d9-961b-b08d6b8e9468 tempest-ServersTestJSON-1986579007 tempest-ServersTestJSON-1986579007-project-member] Lock "ad0b21ff-90be-4a78-8cc7-b347df8579a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.163s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2398.118845] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Acquiring lock "53c9934a-21cb-4b10-9a3a-c0252138eb28" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2398.119158] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Lock "53c9934a-21cb-4b10-9a3a-c0252138eb28" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2398.129384] env[67977]: DEBUG nova.compute.manager [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 2398.181114] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2398.181480] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2398.183399] env[67977]: INFO nova.compute.claims [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 2398.341344] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58fa7adb-4f05-4e0a-9ceb-a1607a761224 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2398.349345] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-877faf64-3b16-4a61-b50b-f2288c0c45aa {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2398.379611] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ce6b2de-5cc4-4143-81ab-e5c1e19dc1a0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2398.387011] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce6a6e45-67df-4a16-b8ea-a357f89b0a34 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2398.399968] env[67977]: DEBUG nova.compute.provider_tree [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2398.411547] env[67977]: DEBUG nova.scheduler.client.report [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2398.429249] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.248s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2398.429762] env[67977]: DEBUG nova.compute.manager [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 2398.466584] env[67977]: DEBUG nova.compute.utils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 2398.472019] env[67977]: DEBUG nova.compute.manager [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 2398.472019] env[67977]: DEBUG nova.network.neutron [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 2398.481262] env[67977]: DEBUG nova.compute.manager [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 2398.545597] env[67977]: DEBUG nova.policy [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52f7dac3375c46edb1606f71743c0e8c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dd6cca838c794b61b43b32bb8b72f6b9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}}
[ 2398.550625] env[67977]: DEBUG nova.compute.manager [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Start spawning the instance on the hypervisor. {{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 2398.577079] env[67977]: DEBUG nova.virt.hardware [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=<?>,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:53:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 2398.577434] env[67977]: DEBUG nova.virt.hardware [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 2398.577644] env[67977]: DEBUG nova.virt.hardware [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 2398.577849] env[67977]: DEBUG nova.virt.hardware [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 2398.577987] env[67977]: DEBUG nova.virt.hardware [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 2398.578146] env[67977]: DEBUG nova.virt.hardware [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 2398.578350] env[67977]: DEBUG nova.virt.hardware [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 2398.578507] env[67977]: DEBUG nova.virt.hardware [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 2398.578675] env[67977]: DEBUG nova.virt.hardware [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 2398.578830] env[67977]: DEBUG nova.virt.hardware [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 2398.578999] env[67977]: DEBUG nova.virt.hardware [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 2398.579884] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc1ea924-b4bf-41dc-a84e-b2ba49cc89a2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2398.588306] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7677dac3-ae6c-412a-a880-cc54fdb5adec {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2398.833642] env[67977]: DEBUG nova.network.neutron [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Successfully created port: e2086c20-77b0-4f78-8986-2ebfda567e3a {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 2399.412139] env[67977]: DEBUG nova.compute.manager [req-22fe1afe-551e-4f07-8d84-fc763ce54f9d req-a5f9195f-a3ca-4128-ab68-c0c042bdac97 service nova] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Received event network-vif-plugged-e2086c20-77b0-4f78-8986-2ebfda567e3a {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2399.412139] env[67977]: DEBUG oslo_concurrency.lockutils [req-22fe1afe-551e-4f07-8d84-fc763ce54f9d req-a5f9195f-a3ca-4128-ab68-c0c042bdac97 service nova] Acquiring lock "53c9934a-21cb-4b10-9a3a-c0252138eb28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2399.412139] env[67977]: DEBUG oslo_concurrency.lockutils [req-22fe1afe-551e-4f07-8d84-fc763ce54f9d req-a5f9195f-a3ca-4128-ab68-c0c042bdac97 service nova] Lock "53c9934a-21cb-4b10-9a3a-c0252138eb28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2399.412139] env[67977]: DEBUG oslo_concurrency.lockutils [req-22fe1afe-551e-4f07-8d84-fc763ce54f9d req-a5f9195f-a3ca-4128-ab68-c0c042bdac97 service nova] Lock "53c9934a-21cb-4b10-9a3a-c0252138eb28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2399.412484] env[67977]: DEBUG nova.compute.manager [req-22fe1afe-551e-4f07-8d84-fc763ce54f9d req-a5f9195f-a3ca-4128-ab68-c0c042bdac97 service nova] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] No waiting events found dispatching network-vif-plugged-e2086c20-77b0-4f78-8986-2ebfda567e3a {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 2399.412484] env[67977]: WARNING nova.compute.manager [req-22fe1afe-551e-4f07-8d84-fc763ce54f9d req-a5f9195f-a3ca-4128-ab68-c0c042bdac97 service nova] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Received unexpected event network-vif-plugged-e2086c20-77b0-4f78-8986-2ebfda567e3a for instance with vm_state building and task_state spawning.
[ 2399.527508] env[67977]: DEBUG nova.network.neutron [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Successfully updated port: e2086c20-77b0-4f78-8986-2ebfda567e3a {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 2399.538259] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Acquiring lock "refresh_cache-53c9934a-21cb-4b10-9a3a-c0252138eb28" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2399.538259] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Acquired lock "refresh_cache-53c9934a-21cb-4b10-9a3a-c0252138eb28" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2399.538259] env[67977]: DEBUG nova.network.neutron [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 2399.610281] env[67977]: DEBUG nova.network.neutron [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Instance cache missing network info. {{(pid=67977) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 2399.970668] env[67977]: DEBUG nova.network.neutron [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Updating instance_info_cache with network_info: [{"id": "e2086c20-77b0-4f78-8986-2ebfda567e3a", "address": "fa:16:3e:4d:5b:a5", "network": {"id": "d5bada29-06ef-44bb-b2f1-453058cf58d0", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-403508856-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dd6cca838c794b61b43b32bb8b72f6b9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "65e4a2b4-fd64-4ac9-b2ec-bac768b501c5", "external-id": "nsx-vlan-transportzone-449", "segmentation_id": 449, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2086c20-77", "ovs_interfaceid": "e2086c20-77b0-4f78-8986-2ebfda567e3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2399.983014] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Releasing lock "refresh_cache-53c9934a-21cb-4b10-9a3a-c0252138eb28" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2399.983434] env[67977]: DEBUG nova.compute.manager [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Instance network_info: |[{"id": "e2086c20-77b0-4f78-8986-2ebfda567e3a", "address": "fa:16:3e:4d:5b:a5", "network": {"id": "d5bada29-06ef-44bb-b2f1-453058cf58d0", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-403508856-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dd6cca838c794b61b43b32bb8b72f6b9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "65e4a2b4-fd64-4ac9-b2ec-bac768b501c5", "external-id": "nsx-vlan-transportzone-449", "segmentation_id": 449, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2086c20-77", "ovs_interfaceid": "e2086c20-77b0-4f78-8986-2ebfda567e3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 2399.983713] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4d:5b:a5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '65e4a2b4-fd64-4ac9-b2ec-bac768b501c5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e2086c20-77b0-4f78-8986-2ebfda567e3a', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 2399.991915] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Creating folder: Project (dd6cca838c794b61b43b32bb8b72f6b9). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 2399.992420] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cef4e5a9-5fb3-4be1-83f0-17e0e622be3c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2400.003099] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Created folder: Project (dd6cca838c794b61b43b32bb8b72f6b9) in parent group-v693022.
[ 2400.003275] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Creating folder: Instances. Parent ref: group-v693127. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 2400.003498] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-79b843a7-d50a-4752-b23c-7d05142450a0 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2400.011065] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Created folder: Instances in parent group-v693127.
[ 2400.011287] env[67977]: DEBUG oslo.service.loopingcall [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2400.011455] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 2400.011631] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4c02cec1-8b75-4e84-9db2-5640e4049d28 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2400.028684] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 2400.028684] env[67977]: value = "task-3468315"
[ 2400.028684] env[67977]: _type = "Task"
[ 2400.028684] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2400.035511] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468315, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2400.539136] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468315, 'name': CreateVM_Task} progress is 25%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2401.038941] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468315, 'name': CreateVM_Task, 'duration_secs': 0.617532} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2401.041067] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 2401.041067] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2401.041067] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2401.041067] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 2401.041067] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4296ce6d-1598-4cce-bc7c-a32e4ea7ad7f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2401.045214] env[67977]: DEBUG oslo_vmware.api [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Waiting for the task: (returnval){
[ 2401.045214] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]527653cd-1ba0-fb08-bc63-150d2c07f477"
[ 2401.045214] env[67977]: _type = "Task"
[ 2401.045214] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2401.053647] env[67977]: DEBUG oslo_vmware.api [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]527653cd-1ba0-fb08-bc63-150d2c07f477, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2401.439229] env[67977]: DEBUG nova.compute.manager [req-b9e1bbda-37bb-4e71-953e-6ea26f40432d req-30924f69-4004-4865-b583-88a459dd5288 service nova] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Received event network-changed-e2086c20-77b0-4f78-8986-2ebfda567e3a {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2401.439229] env[67977]: DEBUG nova.compute.manager [req-b9e1bbda-37bb-4e71-953e-6ea26f40432d req-30924f69-4004-4865-b583-88a459dd5288 service nova] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Refreshing instance network info cache due to event network-changed-e2086c20-77b0-4f78-8986-2ebfda567e3a. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 2401.439229] env[67977]: DEBUG oslo_concurrency.lockutils [req-b9e1bbda-37bb-4e71-953e-6ea26f40432d req-30924f69-4004-4865-b583-88a459dd5288 service nova] Acquiring lock "refresh_cache-53c9934a-21cb-4b10-9a3a-c0252138eb28" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2401.439338] env[67977]: DEBUG oslo_concurrency.lockutils [req-b9e1bbda-37bb-4e71-953e-6ea26f40432d req-30924f69-4004-4865-b583-88a459dd5288 service nova] Acquired lock "refresh_cache-53c9934a-21cb-4b10-9a3a-c0252138eb28" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2401.439538] env[67977]: DEBUG nova.network.neutron [req-b9e1bbda-37bb-4e71-953e-6ea26f40432d req-30924f69-4004-4865-b583-88a459dd5288 service nova] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Refreshing network info cache for port e2086c20-77b0-4f78-8986-2ebfda567e3a {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 2401.556194] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2401.556552] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 2401.556665] env[67977]: DEBUG oslo_concurrency.lockutils [None req-41e11671-5bfc-498f-a138-440424246d3b tempest-ServerMetadataNegativeTestJSON-781248492 tempest-ServerMetadataNegativeTestJSON-781248492-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2401.671884] env[67977]: DEBUG nova.network.neutron [req-b9e1bbda-37bb-4e71-953e-6ea26f40432d req-30924f69-4004-4865-b583-88a459dd5288 service nova] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Updated VIF entry in instance network info cache for port e2086c20-77b0-4f78-8986-2ebfda567e3a. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 2401.672291] env[67977]: DEBUG nova.network.neutron [req-b9e1bbda-37bb-4e71-953e-6ea26f40432d req-30924f69-4004-4865-b583-88a459dd5288 service nova] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Updating instance_info_cache with network_info: [{"id": "e2086c20-77b0-4f78-8986-2ebfda567e3a", "address": "fa:16:3e:4d:5b:a5", "network": {"id": "d5bada29-06ef-44bb-b2f1-453058cf58d0", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-403508856-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dd6cca838c794b61b43b32bb8b72f6b9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "65e4a2b4-fd64-4ac9-b2ec-bac768b501c5", "external-id": "nsx-vlan-transportzone-449", "segmentation_id": 449, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2086c20-77", "ovs_interfaceid": "e2086c20-77b0-4f78-8986-2ebfda567e3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2401.681584] env[67977]: DEBUG oslo_concurrency.lockutils [req-b9e1bbda-37bb-4e71-953e-6ea26f40432d req-30924f69-4004-4865-b583-88a459dd5288 service nova] Releasing lock "refresh_cache-53c9934a-21cb-4b10-9a3a-c0252138eb28" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2427.831667] env[67977]: WARNING oslo_vmware.rw_handles [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2427.831667] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2427.831667] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2427.831667] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2427.831667] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2427.831667] env[67977]: ERROR oslo_vmware.rw_handles response.begin()
[ 2427.831667] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2427.831667] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2427.831667] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2427.831667] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2427.831667] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2427.831667] env[67977]: ERROR oslo_vmware.rw_handles
[ 2427.832637] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/664d5616-2b5a-48cc-8deb-03fb43285c16/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2427.834190] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2427.834546] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Copying Virtual Disk [datastore1] vmware_temp/664d5616-2b5a-48cc-8deb-03fb43285c16/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/664d5616-2b5a-48cc-8deb-03fb43285c16/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2427.834753] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9c0a2b6d-8a0b-448e-a32d-79774f2dcedc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2427.842889] env[67977]: DEBUG oslo_vmware.api [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Waiting for the task: (returnval){
[ 2427.842889] env[67977]: value = "task-3468316"
[ 2427.842889] env[67977]: _type = "Task"
[ 2427.842889] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2427.851111] env[67977]: DEBUG oslo_vmware.api [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Task: {'id': task-3468316, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2428.353028] env[67977]: DEBUG oslo_vmware.exceptions [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2428.353299] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2428.353874] env[67977]: ERROR nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2428.353874] env[67977]: Faults: ['InvalidArgument']
[ 2428.353874] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Traceback (most recent call last):
[ 2428.353874] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2428.353874] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] yield resources
[ 2428.353874] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2428.353874] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] self.driver.spawn(context, instance, image_meta,
[ 2428.353874] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2428.353874] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2428.353874] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2428.353874] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] self._fetch_image_if_missing(context, vi)
[ 2428.353874] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] image_cache(vi, tmp_image_ds_loc)
[ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] vm_util.copy_virtual_disk(
[ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] session._wait_for_task(vmdk_copy_task)
[ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] return self.wait_for_task(task_ref) [ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] return evt.wait() [ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] result = hub.switch() [ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2428.354300] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] return self.greenlet.switch() [ 2428.354706] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2428.354706] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] self.f(*self.args, **self.kw) [ 2428.354706] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2428.354706] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] raise exceptions.translate_fault(task_info.error) [ 2428.354706] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2428.354706] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Faults: ['InvalidArgument'] [ 2428.354706] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] [ 2428.354706] env[67977]: INFO nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Terminating instance [ 2428.355736] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2428.355949] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2428.356196] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-76563a64-4158-4c7a-84f6-8ce741f44a37 {{(pid=67977) 
[ 2428.358462] env[67977]: DEBUG nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2428.358650] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2428.359345] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cea70edf-7210-43d5-a55b-2420ea109e4e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2428.365890] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2428.366105] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3d1ea10f-c3a5-4af7-a49c-3d7d06bdead5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2428.368172] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2428.368348] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2428.369305] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d1a0c3b4-f358-4b8a-b051-2fb10e553c4e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2428.374257] env[67977]: DEBUG oslo_vmware.api [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){
[ 2428.374257] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d8131c-9f6c-970e-0075-d41fa3ba4486"
[ 2428.374257] env[67977]: _type = "Task"
[ 2428.374257] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2428.381012] env[67977]: DEBUG oslo_vmware.api [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52d8131c-9f6c-970e-0075-d41fa3ba4486, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2428.435730] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2428.435948] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2428.436155] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Deleting the datastore file [datastore1] 98f7c8cc-b27f-406c-b34d-22c2c4609e24 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2428.436416] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c8f5d3be-4f06-4a5c-bb04-74d872e9237d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2428.442216] env[67977]: DEBUG oslo_vmware.api [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Waiting for the task: (returnval){ [ 2428.442216] env[67977]: value = "task-3468318" [ 2428.442216] env[67977]: _type = "Task" [ 2428.442216] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2428.449823] env[67977]: DEBUG oslo_vmware.api [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Task: {'id': task-3468318, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2428.884796] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2428.885079] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating directory with path [datastore1] vmware_temp/e6aa20d8-2037-42b5-8ca4-4115abbbfc8d/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2428.885312] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-29ba14c2-0eec-4daa-9f86-a4950c6fae40 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2428.898907] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Created directory with path [datastore1] vmware_temp/e6aa20d8-2037-42b5-8ca4-4115abbbfc8d/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2428.899158] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Fetch image to [datastore1] vmware_temp/e6aa20d8-2037-42b5-8ca4-4115abbbfc8d/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2428.899330] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/e6aa20d8-2037-42b5-8ca4-4115abbbfc8d/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2428.900129] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a761b79-81c7-4926-8b60-ea94b3033c9b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2428.906998] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80d1d039-6137-42c4-acde-2728c084c2f4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2428.917255] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a820142-f2de-4f45-b5f7-4db68aad44ca {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2428.950842] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c33924aa-ee1f-4731-aa80-918297179075 {{(pid=67977) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2428.957879] env[67977]: DEBUG oslo_vmware.api [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Task: {'id': task-3468318, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062128} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2428.959348] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2428.959540] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2428.959713] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2428.959888] env[67977]: INFO nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Took 0.60 seconds to destroy the instance on the hypervisor. 
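[editor's note] The surrounding entries (wait_for_task at oslo_vmware/api.py:397, _poll_task at :434, completion at :444) all come from one polling pattern: a looping call re-reads the vSphere TaskInfo until the task succeeds or faults, which is what produces the repeated "Task: {...} progress is 0%." lines and, on failure, the translated VimFaultException seen later in this log. A minimal sketch of that loop follows, assuming a hypothetical get_task_info callable standing in for the TaskInfo lookup; this is an illustration of the polling behavior, not the real oslo.vmware API.

    import time

    class TaskFault(Exception):
        """Stand-in for the fault raised when a polled task errors
        (cf. exceptions.translate_fault in the traceback below)."""

    def wait_for_task(get_task_info, task_id, poll_interval=0.5):
        # get_task_info(task_id) -> dict with 'state' in
        # {'running', 'success', 'error'} plus optional 'progress'.
        # Hypothetical helper; the real code reads vSphere TaskInfo.
        start = time.monotonic()
        while True:
            info = get_task_info(task_id)  # one _poll_task iteration
            if info['state'] == 'running':
                # corresponds to: "Task: {'id': ...} progress is N%."
                print(f"Task {task_id} progress is {info.get('progress', 0)}%.")
            elif info['state'] == 'success':
                # corresponds to: "{'id': ..., 'duration_secs': ...}
                # completed successfully."
                info['duration_secs'] = time.monotonic() - start
                return info
            else:
                # an 'error' state surfaces as VimFaultException
                # ("A specified parameter was not correct: fileType")
                raise TaskFault(info.get('error', 'unknown fault'))
            time.sleep(poll_interval)

[end editor's note]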
[ 2428.961636] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-70d6b2ce-5554-4bcf-a3ca-648d06d58407 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2428.963469] env[67977]: DEBUG nova.compute.claims [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2428.963654] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2428.963879] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2428.988363] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2429.041892] env[67977]: DEBUG oslo_vmware.rw_handles [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e6aa20d8-2037-42b5-8ca4-4115abbbfc8d/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2429.096052] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2429.100729] env[67977]: DEBUG oslo_vmware.rw_handles [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2429.100911] env[67977]: DEBUG oslo_vmware.rw_handles [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e6aa20d8-2037-42b5-8ca4-4115abbbfc8d/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2429.160216] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bf16608-0fde-46f2-84a4-efc56faa5000 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2429.168456] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa088209-9dfb-4440-b687-82e4dd7930ea {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2429.199713] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36540db1-5b65-42f9-ae8a-a746704064c3 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2429.207034] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f04676f-f074-43ae-a85e-2ca5022d0bf2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2429.220113] env[67977]: DEBUG nova.compute.provider_tree [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2429.228499] env[67977]: DEBUG nova.scheduler.client.report [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2429.242345] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.278s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2429.242895] env[67977]: ERROR nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2429.242895] env[67977]: Faults: ['InvalidArgument'] [ 2429.242895] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Traceback (most recent call last): [ 2429.242895] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2429.242895] env[67977]: 
ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] self.driver.spawn(context, instance, image_meta, [ 2429.242895] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2429.242895] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2429.242895] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2429.242895] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] self._fetch_image_if_missing(context, vi) [ 2429.242895] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2429.242895] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] image_cache(vi, tmp_image_ds_loc) [ 2429.242895] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] vm_util.copy_virtual_disk( [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] session._wait_for_task(vmdk_copy_task) [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] return self.wait_for_task(task_ref) [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] return evt.wait() [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] result = hub.switch() [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] return self.greenlet.switch() [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2429.243256] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] self.f(*self.args, **self.kw) [ 2429.243574] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2429.243574] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] raise exceptions.translate_fault(task_info.error) [ 2429.243574] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2429.243574] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Faults: ['InvalidArgument'] [ 2429.243574] env[67977]: ERROR nova.compute.manager [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] [ 2429.243691] env[67977]: DEBUG nova.compute.utils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2429.245013] env[67977]: DEBUG nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Build of instance 98f7c8cc-b27f-406c-b34d-22c2c4609e24 was re-scheduled: A specified parameter was not correct: fileType [ 2429.245013] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2429.245398] env[67977]: DEBUG nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2429.245580] env[67977]: DEBUG nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2429.245791] env[67977]: DEBUG nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2429.245959] env[67977]: DEBUG nova.network.neutron [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2429.586207] env[67977]: DEBUG nova.network.neutron [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2429.599022] env[67977]: INFO nova.compute.manager [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Took 0.35 seconds to deallocate network for instance. [ 2429.688125] env[67977]: INFO nova.scheduler.client.report [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Deleted allocations for instance 98f7c8cc-b27f-406c-b34d-22c2c4609e24 [ 2429.708819] env[67977]: DEBUG oslo_concurrency.lockutils [None req-17dade0b-dced-47cb-91a0-44a5fc65df25 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 521.123s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2429.709099] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 324.769s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2429.709326] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2429.709534] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2429.709697] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2429.711942] env[67977]: INFO nova.compute.manager [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Terminating instance [ 2429.713668] env[67977]: DEBUG nova.compute.manager [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2429.713809] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2429.714309] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bbf55c63-f85f-424d-8eb0-188cf5fdcc91 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2429.725031] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06561be5-5306-40bd-8e12-5e405906e55a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2429.752184] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 98f7c8cc-b27f-406c-b34d-22c2c4609e24 could not be found. [ 2429.752389] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2429.752564] env[67977]: INFO nova.compute.manager [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2429.752798] env[67977]: DEBUG oslo.service.loopingcall [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2429.753014] env[67977]: DEBUG nova.compute.manager [-] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2429.753125] env[67977]: DEBUG nova.network.neutron [-] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2429.775425] env[67977]: DEBUG nova.network.neutron [-] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2429.777074] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2429.783600] env[67977]: INFO nova.compute.manager [-] [instance: 98f7c8cc-b27f-406c-b34d-22c2c4609e24] Took 0.03 seconds to deallocate network for instance. [ 2429.868761] env[67977]: DEBUG oslo_concurrency.lockutils [None req-b6766a5a-9ff7-4409-9241-0e8e5505fc72 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Lock "98f7c8cc-b27f-406c-b34d-22c2c4609e24" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.159s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2430.775947] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2431.776233] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2431.776570] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2435.776400] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2435.776757] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2436.771068] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2436.774511] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2436.774673] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2436.774798] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2436.792037] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2436.792314] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2436.792314] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 396fd258-dc89-4392-a487-921958012e92] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2436.792440] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2436.792561] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2436.792682] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2436.792799] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2436.792916] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. 
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2439.775411] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2439.786485] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2439.786707] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2439.786897] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2439.787070] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2439.788169] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2da81ebe-2696-45a1-a4e9-163e36681aef {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.798175] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6656b18-4c29-4007-9809-0bcb9470015b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.817620] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-826c52c4-f2ee-4298-a1a6-f1ff4ed52845 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.826139] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb0547ad-839e-4832-abc0-76f2be5f6d10 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.862920] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180939MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2439.863077] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
2439.863270] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2439.921300] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 142d3b29-b467-4007-84ac-8b7e0ee9e326 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2439.921467] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1536ad10-129b-439d-80c5-08fa92aeaed1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2439.921597] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 396fd258-dc89-4392-a487-921958012e92 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2439.921722] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 19f06190-ebc3-4089-81d4-1ac09ec46b46 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2439.921844] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2439.921962] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d4339fd1-c694-4349-bc12-4d915fa23079 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2439.922113] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 53c9934a-21cb-4b10-9a3a-c0252138eb28 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2439.922314] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2439.922453] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2440.016886] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-337b21fd-64fa-4e42-88c4-eaea1acc92ee {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.024635] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9be06a22-bdc1-4e1e-8cfa-9dd9e32f9adc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.054758] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0ab319c-f0cf-41be-b5d0-d16f3498ef70 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.061986] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc16c1f4-1903-42e8-88cb-2b6e47474cd5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.074800] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2440.082683] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2440.095345] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2440.095566] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.232s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2475.445571] env[67977]: WARNING oslo_vmware.rw_handles [None 
req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2475.445571] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2475.445571] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2475.445571] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2475.445571] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2475.445571] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 2475.445571] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2475.445571] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2475.445571] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2475.445571] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2475.445571] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2475.445571] env[67977]: ERROR oslo_vmware.rw_handles [ 2475.446208] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/e6aa20d8-2037-42b5-8ca4-4115abbbfc8d/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2475.448304] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2475.448567] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Copying Virtual Disk [datastore1] vmware_temp/e6aa20d8-2037-42b5-8ca4-4115abbbfc8d/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/e6aa20d8-2037-42b5-8ca4-4115abbbfc8d/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2475.448872] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-322dddd2-bb29-4e8f-b823-caff98f15501 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2475.457405] env[67977]: DEBUG oslo_vmware.api [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 2475.457405] env[67977]: value = "task-3468319" [ 2475.457405] env[67977]: _type = "Task" [ 2475.457405] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2475.464905] env[67977]: DEBUG oslo_vmware.api [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468319, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2475.968298] env[67977]: DEBUG oslo_vmware.exceptions [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Fault InvalidArgument not matched. {{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2475.968780] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2475.969365] env[67977]: ERROR nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2475.969365] env[67977]: Faults: ['InvalidArgument'] [ 2475.969365] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Traceback (most recent call last): [ 2475.969365] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2475.969365] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] yield resources [ 2475.969365] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2475.969365] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] self.driver.spawn(context, instance, image_meta, [ 2475.969365] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2475.969365] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2475.969365] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2475.969365] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] self._fetch_image_if_missing(context, vi) [ 2475.969365] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] image_cache(vi, tmp_image_ds_loc) [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] vm_util.copy_virtual_disk( [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] session._wait_for_task(vmdk_copy_task) [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] return self.wait_for_task(task_ref) [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] return evt.wait() [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] result = hub.switch() [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2475.969714] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] return self.greenlet.switch() [ 2475.970025] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2475.970025] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] self.f(*self.args, **self.kw) [ 2475.970025] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2475.970025] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] raise exceptions.translate_fault(task_info.error) [ 2475.970025] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2475.970025] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Faults: ['InvalidArgument'] [ 2475.970025] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] [ 2475.970025] env[67977]: INFO nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Terminating instance [ 2475.972068] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2475.972068] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2475.972068] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-415bf46e-71ae-4bd9-9fb0-5841fe6d2a99 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2475.974016] env[67977]: DEBUG nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2475.974225] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2475.974973] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d31db023-a399-4880-9e30-037af1af622d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2475.982179] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2475.983262] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0c70f15d-a4f3-41b6-acf2-8d4d6d469a16 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2475.984683] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2475.984856] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2475.985548] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b4d5b358-1e93-4c10-8181-19d8ff97881f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2475.991906] env[67977]: DEBUG oslo_vmware.api [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){ [ 2475.991906] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5264007a-f443-4601-c8c1-2b8b1b87590f" [ 2475.991906] env[67977]: _type = "Task" [ 2475.991906] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2475.999508] env[67977]: DEBUG oslo_vmware.api [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]5264007a-f443-4601-c8c1-2b8b1b87590f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2476.064945] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2476.065110] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2476.065347] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleting the datastore file [datastore1] 142d3b29-b467-4007-84ac-8b7e0ee9e326 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2476.065615] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6767a7c1-dfc7-4e48-9693-a1ed078469a9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.072298] env[67977]: DEBUG oslo_vmware.api [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for the task: (returnval){ [ 2476.072298] env[67977]: value = "task-3468321" [ 2476.072298] env[67977]: _type = "Task" [ 2476.072298] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2476.080038] env[67977]: DEBUG oslo_vmware.api [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468321, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2476.501172] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2476.501440] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Creating directory with path [datastore1] vmware_temp/d6e97cfe-0029-4184-a172-8bc3a64cfce3/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2476.501680] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-00b91576-43e4-49f6-b795-9647b6491f20 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.514382] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Created directory with path [datastore1] vmware_temp/d6e97cfe-0029-4184-a172-8bc3a64cfce3/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2476.514590] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Fetch image to [datastore1] vmware_temp/d6e97cfe-0029-4184-a172-8bc3a64cfce3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2476.514760] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/d6e97cfe-0029-4184-a172-8bc3a64cfce3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2476.515545] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5595003-3885-4238-90e5-c971022f8029 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.522371] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddef9e89-be67-4b70-8c48-6b84336b31cb {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.531581] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9741bc0-7eeb-4858-a599-9c3ae55a356e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.562375] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acc45a2d-87f3-474a-8376-6e7ef5eb0d1d {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.568407] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4c9e08ab-54f6-4f4c-b624-442602f329a8 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.581104] env[67977]: DEBUG oslo_vmware.api [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Task: {'id': task-3468321, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.0758} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2476.581347] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2476.581530] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2476.581700] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2476.581874] env[67977]: INFO nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 2476.584085] env[67977]: DEBUG nova.compute.claims [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2476.584262] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2476.584480] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2476.594732] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2476.649803] env[67977]: DEBUG oslo_vmware.rw_handles [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d6e97cfe-0029-4184-a172-8bc3a64cfce3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2476.711811] env[67977]: DEBUG oslo_vmware.rw_handles [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2476.712019] env[67977]: DEBUG oslo_vmware.rw_handles [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d6e97cfe-0029-4184-a172-8bc3a64cfce3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2476.772556] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cf5b2c6-23df-4d1b-b456-3fd76c64b650 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.780267] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2316d78f-1181-4e54-9886-e286781a8822 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.809991] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55b7cc04-45c9-4f7a-9132-4c57eaae11e6 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.816847] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c77ccbc7-2322-48af-9586-c6b45e558187 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.829632] env[67977]: DEBUG nova.compute.provider_tree [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2476.838717] env[67977]: DEBUG nova.scheduler.client.report [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2476.854743] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.270s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2476.854976] env[67977]: ERROR nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2476.854976] env[67977]: Faults: ['InvalidArgument'] [ 2476.854976] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Traceback (most recent call last): [ 2476.854976] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2476.854976] env[67977]: ERROR 
nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] self.driver.spawn(context, instance, image_meta, [ 2476.854976] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2476.854976] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2476.854976] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2476.854976] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] self._fetch_image_if_missing(context, vi) [ 2476.854976] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2476.854976] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] image_cache(vi, tmp_image_ds_loc) [ 2476.854976] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] vm_util.copy_virtual_disk( [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] session._wait_for_task(vmdk_copy_task) [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] return self.wait_for_task(task_ref) [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] return evt.wait() [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] result = hub.switch() [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] return self.greenlet.switch() [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2476.855357] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] self.f(*self.args, **self.kw) [ 2476.855693] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2476.855693] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] raise exceptions.translate_fault(task_info.error) [ 2476.855693] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2476.855693] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Faults: ['InvalidArgument'] [ 2476.855693] env[67977]: ERROR nova.compute.manager [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] [ 2476.855693] env[67977]: DEBUG nova.compute.utils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2476.857070] env[67977]: DEBUG nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Build of instance 142d3b29-b467-4007-84ac-8b7e0ee9e326 was re-scheduled: A specified parameter was not correct: fileType [ 2476.857070] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2476.857456] env[67977]: DEBUG nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2476.857629] env[67977]: DEBUG nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2476.857800] env[67977]: DEBUG nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2476.857962] env[67977]: DEBUG nova.network.neutron [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2477.263721] env[67977]: DEBUG nova.network.neutron [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2477.282063] env[67977]: INFO nova.compute.manager [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Took 0.42 seconds to deallocate network for instance. [ 2477.401427] env[67977]: INFO nova.scheduler.client.report [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Deleted allocations for instance 142d3b29-b467-4007-84ac-8b7e0ee9e326 [ 2477.423598] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c23bade2-410d-4712-ba36-1387000cac2c tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "142d3b29-b467-4007-84ac-8b7e0ee9e326" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 526.115s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2477.423871] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "142d3b29-b467-4007-84ac-8b7e0ee9e326" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 329.246s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2477.424238] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "142d3b29-b467-4007-84ac-8b7e0ee9e326-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2477.424496] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "142d3b29-b467-4007-84ac-8b7e0ee9e326-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67977) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2477.424675] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "142d3b29-b467-4007-84ac-8b7e0ee9e326-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2477.426695] env[67977]: INFO nova.compute.manager [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Terminating instance [ 2477.428749] env[67977]: DEBUG nova.compute.manager [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2477.428959] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2477.429438] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ce989aa8-59d5-4b28-81a8-73859c74099e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2477.438806] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d644b9e-7c2a-49e9-bc62-d3ee6789c9af {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2477.467130] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 142d3b29-b467-4007-84ac-8b7e0ee9e326 could not be found. [ 2477.467348] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2477.467525] env[67977]: INFO nova.compute.manager [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2477.467762] env[67977]: DEBUG oslo.service.loopingcall [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2477.467973] env[67977]: DEBUG nova.compute.manager [-] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2477.468087] env[67977]: DEBUG nova.network.neutron [-] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2477.497044] env[67977]: DEBUG nova.network.neutron [-] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2477.505082] env[67977]: INFO nova.compute.manager [-] [instance: 142d3b29-b467-4007-84ac-8b7e0ee9e326] Took 0.04 seconds to deallocate network for instance. [ 2477.596918] env[67977]: DEBUG oslo_concurrency.lockutils [None req-9fa3d03e-4bcf-4c74-b027-4d8305708950 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Lock "142d3b29-b467-4007-84ac-8b7e0ee9e326" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.173s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2489.095581] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2489.775121] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2492.776593] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2492.776943] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2492.776981] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2493.771339] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2495.775625] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2495.775943] env[67977]: DEBUG
nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2498.271185] env[67977]: DEBUG oslo_concurrency.lockutils [None req-90a1948a-c4fe-421c-9350-bbbfb03aed68 tempest-AttachInterfacesTestJSON-94217131 tempest-AttachInterfacesTestJSON-94217131-project-member] Acquiring lock "da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2498.694676] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2498.695206] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Getting list of instances from cluster (obj){ [ 2498.695206] env[67977]: value = "domain-c8" [ 2498.695206] env[67977]: _type = "ClusterComputeResource" [ 2498.695206] env[67977]: } {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2498.696294] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e38a7097-30a0-4c23-b2d6-13f5fe7a0305 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2498.710089] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Got total of 6 instances {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2498.801030] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2498.801233] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2498.801362] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2498.801423] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2498.818063] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2498.818233] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 396fd258-dc89-4392-a487-921958012e92] Skipping network cache update for instance because it is Building.
{{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2498.818404] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2498.818485] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2498.818607] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2498.818731] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2498.818853] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2500.775407] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2500.788535] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2500.788535] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2500.788728] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2500.788791] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67977) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2500.789895] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44131eff-b113-44a1-bb5f-adbe2abf9d78 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.798493] 
env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96db9aeb-7d3a-4e60-aecc-84906d20d36d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.813260] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2f61432-182b-48f8-bf08-8da8a8d0111c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.819501] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9e70375-49c2-40be-9398-d0d931181be9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.847845] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180917MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67977) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2500.847989] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2500.848193] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2500.976571] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 1536ad10-129b-439d-80c5-08fa92aeaed1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.976737] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 396fd258-dc89-4392-a487-921958012e92 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.976864] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 19f06190-ebc3-4089-81d4-1ac09ec46b46 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.976985] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.977121] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance d4339fd1-c694-4349-bc12-4d915fa23079 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.977243] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Instance 53c9934a-21cb-4b10-9a3a-c0252138eb28 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67977) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.977437] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2500.977583] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=67977) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2500.994041] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing inventories for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2501.007343] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating ProviderTree inventory for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2501.007555] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Updating inventory in ProviderTree for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2501.018837] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing aggregate associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, aggregates: None {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 
2501.037928] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Refreshing trait associations for resource provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382, traits: COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67977) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2501.126918] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb0a1631-b223-46e8-ae34-f78a15216fb5 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2501.134971] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05a03e4a-10dc-4073-87ee-c29092a15b7a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2501.166037] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb347095-ff61-43d8-ba04-049863099206 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2501.173736] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b6ef5d0-fd37-47dc-861b-6b42bfc01a03 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2501.189095] env[67977]: DEBUG nova.compute.provider_tree [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2501.199774] env[67977]: DEBUG nova.scheduler.client.report [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2501.217975] env[67977]: DEBUG nova.compute.resource_tracker [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67977) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2501.218123] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.370s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2505.775838] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2505.776322] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] 
Cleaning up deleted instances with incomplete migration {{(pid=67977) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2506.776081] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2509.784728] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2509.784728] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Cleaning up deleted instances {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2509.794925] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] There are 0 instances to clean {{(pid=67977) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2523.459419] env[67977]: WARNING oslo_vmware.rw_handles [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2523.459419] env[67977]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2523.459419] env[67977]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2523.459419] env[67977]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2523.459419] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2523.459419] env[67977]: ERROR oslo_vmware.rw_handles response.begin() [ 2523.459419] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2523.459419] env[67977]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2523.459419] env[67977]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2523.459419] env[67977]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2523.459419] env[67977]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2523.459419] env[67977]: ERROR oslo_vmware.rw_handles [ 2523.460165] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Downloaded image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to vmware_temp/d6e97cfe-0029-4184-a172-8bc3a64cfce3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2523.461936] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Caching image {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 
2523.462197] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Copying Virtual Disk [datastore1] vmware_temp/d6e97cfe-0029-4184-a172-8bc3a64cfce3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk to [datastore1] vmware_temp/d6e97cfe-0029-4184-a172-8bc3a64cfce3/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk {{(pid=67977) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2523.462480] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7b5b7853-a352-4427-9d2e-8bcd7bb3e41c {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2523.471446] env[67977]: DEBUG oslo_vmware.api [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){ [ 2523.471446] env[67977]: value = "task-3468322" [ 2523.471446] env[67977]: _type = "Task" [ 2523.471446] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2523.478759] env[67977]: DEBUG oslo_vmware.api [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': task-3468322, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2523.981900] env[67977]: DEBUG oslo_vmware.exceptions [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Fault InvalidArgument not matched. 
{{(pid=67977) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2523.982195] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2523.982736] env[67977]: ERROR nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2523.982736] env[67977]: Faults: ['InvalidArgument'] [ 2523.982736] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Traceback (most recent call last): [ 2523.982736] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2523.982736] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] yield resources [ 2523.982736] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2523.982736] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] self.driver.spawn(context, instance, image_meta, [ 2523.982736] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2523.982736] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2523.982736] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2523.982736] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] self._fetch_image_if_missing(context, vi) [ 2523.982736] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2523.982736] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] image_cache(vi, tmp_image_ds_loc) [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] vm_util.copy_virtual_disk( [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] session._wait_for_task(vmdk_copy_task) [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] return self.wait_for_task(task_ref) [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] return evt.wait() [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] result = hub.switch() [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] return self.greenlet.switch() [ 2523.983057] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2523.983636] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] self.f(*self.args, **self.kw) [ 2523.983636] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2523.983636] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] raise exceptions.translate_fault(task_info.error) [ 2523.983636] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2523.983636] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Faults: ['InvalidArgument'] [ 2523.983636] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] [ 2523.983636] env[67977]: INFO nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Terminating instance [ 2523.984579] env[67977]: DEBUG oslo_concurrency.lockutils [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2523.984787] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2523.985038] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-69259c35-1b9f-488d-92aa-be520dfc6276 {{(pid=67977) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2523.987271] env[67977]: DEBUG nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2523.987457] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2523.988188] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e009b1a8-70d0-4748-9d31-3339d29a7f15 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2523.994869] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Unregistering the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2523.995090] env[67977]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f9e8f60f-f999-4ac1-9044-edb1d05ae554 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2523.999068] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2523.999238] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67977) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2523.999877] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b0901e1d-f487-435f-8423-aa724becfca1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2524.004325] env[67977]: DEBUG oslo_vmware.api [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Waiting for the task: (returnval){ [ 2524.004325] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]529b4b6e-5729-2d70-8506-4b0e910037e3" [ 2524.004325] env[67977]: _type = "Task" [ 2524.004325] env[67977]: } to complete. 
{{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2524.018767] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Preparing fetch location {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2524.018969] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Creating directory with path [datastore1] vmware_temp/a7ff271a-f923-46e5-b9c5-460dcb9253e0/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2524.019178] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bf4fb45b-5d1d-4286-b185-4e539e064fe4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2524.050546] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Created directory with path [datastore1] vmware_temp/a7ff271a-f923-46e5-b9c5-460dcb9253e0/5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2524.050759] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Fetch image to [datastore1] vmware_temp/a7ff271a-f923-46e5-b9c5-460dcb9253e0/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2524.050930] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to [datastore1] vmware_temp/a7ff271a-f923-46e5-b9c5-460dcb9253e0/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk on the data store datastore1 {{(pid=67977) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2524.051736] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a812326-1236-46af-9453-4a8936872b64 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2524.059946] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-709ffd56-8f3e-4cef-85e4-b5bbfbdc9c0a {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2524.786737] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-590661f5-97b2-473d-bed7-6cd516277adc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2524.790592] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 
tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Unregistered the VM {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2524.790789] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Deleting contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2524.790964] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Deleting the datastore file [datastore1] 1536ad10-129b-439d-80c5-08fa92aeaed1 {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2524.791208] env[67977]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-005bce02-a52e-474c-8c0e-860d09890752 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2524.820073] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cd366d1-b2cf-4b56-aa06-434912855e43 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2524.822391] env[67977]: DEBUG oslo_vmware.api [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for the task: (returnval){ [ 2524.822391] env[67977]: value = "task-3468324" [ 2524.822391] env[67977]: _type = "Task" [ 2524.822391] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2524.827064] env[67977]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a4cbd895-e9bb-48a9-8c05-e9311e1c9f01 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2524.830986] env[67977]: DEBUG oslo_vmware.api [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': task-3468324, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2524.849555] env[67977]: DEBUG nova.virt.vmwareapi.images [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] [instance: 396fd258-dc89-4392-a487-921958012e92] Downloading image file data 5ac2bac3-6c5c-4005-b6b0-349a1330d017 to the data store datastore1 {{(pid=67977) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2524.899203] env[67977]: DEBUG oslo_vmware.rw_handles [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a7ff271a-f923-46e5-b9c5-460dcb9253e0/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67977) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2524.958157] env[67977]: DEBUG oslo_vmware.rw_handles [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Completed reading data from the image iterator. {{(pid=67977) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2524.958364] env[67977]: DEBUG oslo_vmware.rw_handles [None req-e2f70422-a190-4bf6-8cbd-b48e41b4d41d tempest-ServerDiskConfigTestJSON-233315990 tempest-ServerDiskConfigTestJSON-233315990-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a7ff271a-f923-46e5-b9c5-460dcb9253e0/5ac2bac3-6c5c-4005-b6b0-349a1330d017/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67977) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2525.333112] env[67977]: DEBUG oslo_vmware.api [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Task: {'id': task-3468324, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069871} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2525.333432] env[67977]: DEBUG nova.virt.vmwareapi.ds_util [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Deleted the datastore file {{(pid=67977) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2525.333640] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Deleted contents of the VM from datastore datastore1 {{(pid=67977) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2525.333844] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2525.334070] env[67977]: INFO nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Took 1.35 seconds to destroy the instance on the hypervisor. 
[ 2525.336316] env[67977]: DEBUG nova.compute.claims [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Aborting claim: {{(pid=67977) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2525.336534] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2525.336791] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2525.463944] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ba39026-1478-4f46-989e-6ea147695b5f {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.470647] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33acf170-7694-4b56-a353-27dcc5fcb118 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.501386] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20af9e36-0284-479a-8682-f0d58997e669 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.509042] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4770ebcd-4eb8-4f3b-8aca-1bdc0af68291 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.521472] env[67977]: DEBUG nova.compute.provider_tree [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2525.529992] env[67977]: DEBUG nova.scheduler.client.report [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2525.543153] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.206s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2525.543688] env[67977]: ERROR nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2525.543688] env[67977]: Faults: ['InvalidArgument'] [ 2525.543688] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Traceback (most recent call last): [ 2525.543688] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2525.543688] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] self.driver.spawn(context, instance, image_meta, [ 2525.543688] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2525.543688] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2525.543688] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2525.543688] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] self._fetch_image_if_missing(context, vi) [ 2525.543688] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2525.543688] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] image_cache(vi, tmp_image_ds_loc) [ 2525.543688] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] vm_util.copy_virtual_disk( [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] session._wait_for_task(vmdk_copy_task) [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] return self.wait_for_task(task_ref) [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] return evt.wait() [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", 
line 125, in wait [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] result = hub.switch() [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] return self.greenlet.switch() [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2525.544298] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] self.f(*self.args, **self.kw) [ 2525.545103] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2525.545103] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] raise exceptions.translate_fault(task_info.error) [ 2525.545103] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2525.545103] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Faults: ['InvalidArgument'] [ 2525.545103] env[67977]: ERROR nova.compute.manager [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] [ 2525.545103] env[67977]: DEBUG nova.compute.utils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] VimFaultException {{(pid=67977) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2525.545887] env[67977]: DEBUG nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Build of instance 1536ad10-129b-439d-80c5-08fa92aeaed1 was re-scheduled: A specified parameter was not correct: fileType [ 2525.545887] env[67977]: Faults: ['InvalidArgument'] {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2525.546383] env[67977]: DEBUG nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Unplugging VIFs for instance {{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2525.546640] env[67977]: DEBUG nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67977) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2525.546813] env[67977]: DEBUG nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2525.547174] env[67977]: DEBUG nova.network.neutron [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2525.982485] env[67977]: DEBUG nova.network.neutron [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2525.994139] env[67977]: INFO nova.compute.manager [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Took 0.45 seconds to deallocate network for instance. [ 2526.090280] env[67977]: INFO nova.scheduler.client.report [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Deleted allocations for instance 1536ad10-129b-439d-80c5-08fa92aeaed1 [ 2526.114585] env[67977]: DEBUG oslo_concurrency.lockutils [None req-386851c1-8082-4d3c-a88d-13a2800dcebb tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "1536ad10-129b-439d-80c5-08fa92aeaed1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 525.295s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2526.114862] env[67977]: DEBUG oslo_concurrency.lockutils [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "1536ad10-129b-439d-80c5-08fa92aeaed1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 328.920s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2526.115099] env[67977]: DEBUG oslo_concurrency.lockutils [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Acquiring lock "1536ad10-129b-439d-80c5-08fa92aeaed1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2526.115310] env[67977]: DEBUG oslo_concurrency.lockutils [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "1536ad10-129b-439d-80c5-08fa92aeaed1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2526.115476] env[67977]: DEBUG oslo_concurrency.lockutils
[None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "1536ad10-129b-439d-80c5-08fa92aeaed1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2526.117716] env[67977]: INFO nova.compute.manager [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Terminating instance [ 2526.120047] env[67977]: DEBUG nova.compute.manager [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Start destroying the instance on the hypervisor. {{(pid=67977) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2526.120047] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Destroying instance {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2526.120577] env[67977]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7b7fa3e2-8265-44a4-8eff-6a951606f2ed {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2526.129995] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-986490d0-c4e0-4c48-921d-8bbaa8465850 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2526.155971] env[67977]: WARNING nova.virt.vmwareapi.vmops [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1536ad10-129b-439d-80c5-08fa92aeaed1 could not be found. [ 2526.156332] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Instance destroyed {{(pid=67977) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2526.156373] env[67977]: INFO nova.compute.manager [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2526.156633] env[67977]: DEBUG oslo.service.loopingcall [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2526.156856] env[67977]: DEBUG nova.compute.manager [-] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Deallocating network for instance {{(pid=67977) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2526.156956] env[67977]: DEBUG nova.network.neutron [-] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] deallocate_for_instance() {{(pid=67977) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2526.180132] env[67977]: DEBUG nova.network.neutron [-] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Updating instance_info_cache with network_info: [] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2526.188209] env[67977]: INFO nova.compute.manager [-] [instance: 1536ad10-129b-439d-80c5-08fa92aeaed1] Took 0.03 seconds to deallocate network for instance. [ 2526.275340] env[67977]: DEBUG oslo_concurrency.lockutils [None req-5c5b3c12-9a69-41cd-97e1-73d709496b9a tempest-ImagesTestJSON-43342777 tempest-ImagesTestJSON-43342777-project-member] Lock "1536ad10-129b-439d-80c5-08fa92aeaed1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.160s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2526.682186] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2526.697832] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Getting list of instances from cluster (obj){ [ 2526.697832] env[67977]: value = "domain-c8" [ 2526.697832] env[67977]: _type = "ClusterComputeResource" [ 2526.697832] env[67977]: } {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2526.699168] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34c7871e-e14f-44e6-873b-76c75b6f471d {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2526.712234] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Got total of 5 instances {{(pid=67977) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2526.712406] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 396fd258-dc89-4392-a487-921958012e92 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2526.712595] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 19f06190-ebc3-4089-81d4-1ac09ec46b46 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2526.712757] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2526.712911] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid d4339fd1-c694-4349-bc12-4d915fa23079 {{(pid=67977) _sync_power_states 
/opt/stack/nova/nova/compute/manager.py:10321}} [ 2526.713073] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Triggering sync for uuid 53c9934a-21cb-4b10-9a3a-c0252138eb28 {{(pid=67977) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2526.713373] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "396fd258-dc89-4392-a487-921958012e92" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2526.713598] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "19f06190-ebc3-4089-81d4-1ac09ec46b46" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2526.713800] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2526.713995] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "d4339fd1-c694-4349-bc12-4d915fa23079" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2526.714204] env[67977]: DEBUG oslo_concurrency.lockutils [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Acquiring lock "53c9934a-21cb-4b10-9a3a-c0252138eb28" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2541.345889] env[67977]: DEBUG oslo_concurrency.lockutils [None req-389216da-177e-4e51-ae10-f6f5567ec3e1 tempest-DeleteServersTestJSON-367932552 tempest-DeleteServersTestJSON-367932552-project-member] Acquiring lock "d4339fd1-c694-4349-bc12-4d915fa23079" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2549.353109] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Acquiring lock "e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2549.354392] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Lock "e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67977) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2549.365088] env[67977]: DEBUG nova.compute.manager [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Starting instance... {{(pid=67977) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2549.417925] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2549.418205] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2549.419657] env[67977]: INFO nova.compute.claims [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2549.546175] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8b153d3-f29f-4c08-ad99-b3308227a8a1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2549.554071] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f6a0fb0-e576-487c-951a-680841cf78fd {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2549.585165] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-578ea523-172a-46f9-9d92-604e5839cecc {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2549.592176] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69a50454-45f5-42f4-b4fe-853ef3dfefc1 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2549.605690] env[67977]: DEBUG nova.compute.provider_tree [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Inventory has not changed in ProviderTree for provider: cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 {{(pid=67977) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2549.614193] env[67977]: DEBUG nova.scheduler.client.report [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Inventory has not changed for provider cc0e1e8e-b80b-4fe6-acfc-fb8ff2bc4382 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67977) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2549.628905] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2549.629395] env[67977]: DEBUG nova.compute.manager [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Start building networks asynchronously for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2549.660831] env[67977]: DEBUG nova.compute.utils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Using /dev/sd instead of None {{(pid=67977) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2549.662052] env[67977]: DEBUG nova.compute.manager [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Allocating IP information in the background. {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2549.662247] env[67977]: DEBUG nova.network.neutron [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] allocate_for_instance() {{(pid=67977) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2549.672065] env[67977]: DEBUG nova.compute.manager [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Start building block device mappings for instance. {{(pid=67977) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2549.741154] env[67977]: DEBUG nova.compute.manager [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Start spawning the instance on the hypervisor. 
{{(pid=67977) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2549.755468] env[67977]: DEBUG nova.policy [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '29a7dce32e674cfe83e5ba1462be3169', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '35a592e9754f4185ae803605651eda8a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67977) authorize /opt/stack/nova/nova/policy.py:203}} [ 2549.776831] env[67977]: DEBUG nova.virt.hardware [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:53:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:53:15Z,direct_url=,disk_format='vmdk',id=5ac2bac3-6c5c-4005-b6b0-349a1330d017,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='504bd5d983344574a09ccb5d9ee0ab47',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:53:16Z,virtual_size=,visibility=), allow threads: False {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2549.776831] env[67977]: DEBUG nova.virt.hardware [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Flavor limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2549.776831] env[67977]: DEBUG nova.virt.hardware [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Image limits 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2549.778062] env[67977]: DEBUG nova.virt.hardware [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Flavor pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2549.778062] env[67977]: DEBUG nova.virt.hardware [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Image pref 0:0:0 {{(pid=67977) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2549.778062] env[67977]: DEBUG nova.virt.hardware [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67977) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:430}} [ 2549.778062] env[67977]: DEBUG nova.virt.hardware [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2549.778062] env[67977]: DEBUG nova.virt.hardware [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2549.778339] env[67977]: DEBUG nova.virt.hardware [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Got 1 possible topologies {{(pid=67977) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2549.778339] env[67977]: DEBUG nova.virt.hardware [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2549.778339] env[67977]: DEBUG nova.virt.hardware [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67977) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2549.778938] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39390659-ea25-45e0-b7b6-7fad24d37ff9 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2549.787212] env[67977]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-142e5fb1-fac1-4a88-9046-231233a778d4 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2549.807562] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2550.101341] env[67977]: DEBUG nova.network.neutron [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Successfully created port: e3462720-c454-4032-b9c7-c962a09aeba9 {{(pid=67977) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2550.774920] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2550.816527] env[67977]: DEBUG nova.compute.manager [req-0040bc09-de7b-49a4-985f-0b7a9bc81e01 
req-61644bec-f075-4ca8-89d5-329845798b3d service nova] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Received event network-vif-plugged-e3462720-c454-4032-b9c7-c962a09aeba9 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2550.816809] env[67977]: DEBUG oslo_concurrency.lockutils [req-0040bc09-de7b-49a4-985f-0b7a9bc81e01 req-61644bec-f075-4ca8-89d5-329845798b3d service nova] Acquiring lock "e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2550.817037] env[67977]: DEBUG oslo_concurrency.lockutils [req-0040bc09-de7b-49a4-985f-0b7a9bc81e01 req-61644bec-f075-4ca8-89d5-329845798b3d service nova] Lock "e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2550.817316] env[67977]: DEBUG oslo_concurrency.lockutils [req-0040bc09-de7b-49a4-985f-0b7a9bc81e01 req-61644bec-f075-4ca8-89d5-329845798b3d service nova] Lock "e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67977) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2550.817495] env[67977]: DEBUG nova.compute.manager [req-0040bc09-de7b-49a4-985f-0b7a9bc81e01 req-61644bec-f075-4ca8-89d5-329845798b3d service nova] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] No waiting events found dispatching network-vif-plugged-e3462720-c454-4032-b9c7-c962a09aeba9 {{(pid=67977) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2550.817658] env[67977]: WARNING nova.compute.manager [req-0040bc09-de7b-49a4-985f-0b7a9bc81e01 req-61644bec-f075-4ca8-89d5-329845798b3d service nova] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Received unexpected event network-vif-plugged-e3462720-c454-4032-b9c7-c962a09aeba9 for instance with vm_state building and task_state spawning. 
[ 2550.864152] env[67977]: DEBUG nova.network.neutron [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Successfully updated port: e3462720-c454-4032-b9c7-c962a09aeba9 {{(pid=67977) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2550.877771] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Acquiring lock "refresh_cache-e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2550.877935] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Acquired lock "refresh_cache-e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2550.878108] env[67977]: DEBUG nova.network.neutron [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Building network info cache for instance {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2550.938987] env[67977]: DEBUG nova.network.neutron [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Instance cache missing network info. 
[ 2551.112393] env[67977]: DEBUG nova.network.neutron [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Updating instance_info_cache with network_info: [{"id": "e3462720-c454-4032-b9c7-c962a09aeba9", "address": "fa:16:3e:36:d2:09", "network": {"id": "fc916a90-83ba-421b-a99d-da90e80b67a8", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-666819918-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "35a592e9754f4185ae803605651eda8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3462720-c4", "ovs_interfaceid": "e3462720-c454-4032-b9c7-c962a09aeba9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2551.124677] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Releasing lock "refresh_cache-e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2551.125080] env[67977]: DEBUG nova.compute.manager [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Instance network_info: |[{"id": "e3462720-c454-4032-b9c7-c962a09aeba9", "address": "fa:16:3e:36:d2:09", "network": {"id": "fc916a90-83ba-421b-a99d-da90e80b67a8", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-666819918-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "35a592e9754f4185ae803605651eda8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3462720-c4", "ovs_interfaceid": "e3462720-c454-4032-b9c7-c962a09aeba9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67977) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
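The network_info blob logged above is the cached VIF model. Purely for illustration, the following treats one VIF entry as a plain JSON dict (fields abbreviated from the log; Nova itself wraps this in its NetworkInfo model) and pulls out the commonly used fields:

    import json

    vif_json = '''
    {"id": "e3462720-c454-4032-b9c7-c962a09aeba9",
     "address": "fa:16:3e:36:d2:09",
     "type": "ovs",
     "devname": "tape3462720-c4",
     "details": {"segmentation_id": 513, "port_filter": true},
     "network": {"bridge": "br-int",
                 "subnets": [{"cidr": "192.168.128.0/28",
                              "ips": [{"address": "192.168.128.9"}]}]}}
    '''

    vif = json.loads(vif_json)
    fixed_ips = [ip["address"]
                 for subnet in vif["network"]["subnets"]
                 for ip in subnet["ips"]]
    # Port id, MAC, integration bridge, and ['192.168.128.9']:
    print(vif["id"], vif["address"], vif["network"]["bridge"], fixed_ips)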
[ 2551.125472] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:36:d2:09', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e3462720-c454-4032-b9c7-c962a09aeba9', 'vif_model': 'vmxnet3'}] {{(pid=67977) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 2551.132885] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Creating folder: Project (35a592e9754f4185ae803605651eda8a). Parent ref: group-v693022. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 2551.133407] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7046bbe5-d93c-42d3-b833-df6aaa7143b2 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2551.144248] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Created folder: Project (35a592e9754f4185ae803605651eda8a) in parent group-v693022.
[ 2551.144450] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Creating folder: Instances. Parent ref: group-v693130. {{(pid=67977) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 2551.144799] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ea78d3d1-903d-40e7-ab27-a8ec175d163e {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2551.153741] env[67977]: INFO nova.virt.vmwareapi.vm_util [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Created folder: Instances in parent group-v693130.
[ 2551.153959] env[67977]: DEBUG oslo.service.loopingcall [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67977) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
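The "Waiting for function ... to return" line is oslo.service's looping-call machinery. A minimal FixedIntervalLoopingCall example in the same spirit, assuming oslo.service (eventlet-based) is installed; the interval and the polled state are illustrative, not Nova's wiring:

    from oslo_service import loopingcall

    progress = {"pct": 0}

    def _poll():
        # Runs once per tick; raising LoopingCallDone stops the loop and
        # hands its retvalue back to wait().
        progress["pct"] += 50
        if progress["pct"] >= 100:
            raise loopingcall.LoopingCallDone(retvalue="created")

    timer = loopingcall.FixedIntervalLoopingCall(_poll)
    print(timer.start(interval=0.5).wait())  # -> "created"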
[ 2551.154150] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Creating VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 2551.154334] env[67977]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9ed81d85-659c-489e-bc2d-2df10bd6490b {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2551.172516] env[67977]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 2551.172516] env[67977]: value = "task-3468327"
[ 2551.172516] env[67977]: _type = "Task"
[ 2551.172516] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2551.179791] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468327, 'name': CreateVM_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2551.684054] env[67977]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468327, 'name': CreateVM_Task, 'duration_secs': 0.309216} completed successfully. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2551.684185] env[67977]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Created VM on the ESX host {{(pid=67977) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 2551.684853] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2551.685034] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2551.685483] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 2551.685745] env[67977]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c988b6db-f120-4763-a454-708b347fe771 {{(pid=67977) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2551.690340] env[67977]: DEBUG oslo_vmware.api [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Waiting for the task: (returnval){
[ 2551.690340] env[67977]: value = "session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ebaee3-a002-2632-e7d5-ef99b4129245"
[ 2551.690340] env[67977]: _type = "Task"
[ 2551.690340] env[67977]: } to complete. {{(pid=67977) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
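A hedged sketch of the oslo.vmware call pattern behind the "Invoking Folder.CreateVM_Task" / "Waiting for the task" lines above. session.invoke_api() and session.wait_for_task() are the real VMwareAPISession entry points; the host, credentials, and all managed-object refs and specs are placeholders, and running this requires a reachable vCenter:

    from oslo_vmware import api

    def create_vm(session, folder_ref, config_spec, res_pool_ref):
        # Kicks off the vSphere task ("Invoking Folder.CreateVM_Task").
        task = session.invoke_api(session.vim, "CreateVM_Task", folder_ref,
                                  config=config_spec, pool=res_pool_ref)
        # wait_for_task() polls the task (the "progress is 0%." lines) and
        # returns its info once it reaches the success state.
        task_info = session.wait_for_task(task)
        return task_info.result

    if __name__ == "__main__":
        # Placeholder endpoint/credentials; not a real vCenter.
        session = api.VMwareAPISession("vc.example.test", "user", "secret",
                                       api_retry_count=10,
                                       task_poll_interval=0.5)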
[ 2551.697505] env[67977]: DEBUG oslo_vmware.api [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Task: {'id': session[52f3ab71-d6dd-bfe8-2432-cd66c5c42f19]52ebaee3-a002-2632-e7d5-ef99b4129245, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67977) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2552.200621] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2552.200918] env[67977]: DEBUG nova.virt.vmwareapi.vmops [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Processing image 5ac2bac3-6c5c-4005-b6b0-349a1330d017 {{(pid=67977) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 2552.201112] env[67977]: DEBUG oslo_concurrency.lockutils [None req-611b1b49-6990-4a2c-89a2-98cacb78e264 tempest-ServerRescueTestJSONUnderV235-837813390 tempest-ServerRescueTestJSONUnderV235-837813390-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5ac2bac3-6c5c-4005-b6b0-349a1330d017/5ac2bac3-6c5c-4005-b6b0-349a1330d017.vmdk" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2552.775440] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2552.841653] env[67977]: DEBUG nova.compute.manager [req-b907425d-28c1-4668-a86e-fe7409a1374d req-cd07b8a9-239f-4dab-8c86-cad8313958e6 service nova] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Received event network-changed-e3462720-c454-4032-b9c7-c962a09aeba9 {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2552.841854] env[67977]: DEBUG nova.compute.manager [req-b907425d-28c1-4668-a86e-fe7409a1374d req-cd07b8a9-239f-4dab-8c86-cad8313958e6 service nova] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Refreshing instance network info cache due to event network-changed-e3462720-c454-4032-b9c7-c962a09aeba9. {{(pid=67977) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
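The lock-then-SearchDatastore_Task sequence above is the image-cache check: serialize on the cache path, then look for the image's cached VMDK before deciding whether to fetch it. An illustrative sketch of that idea only, not Nova's ds_util code; the invoke_api kwargs follow the vSphere SearchDatastore_Task parameters, and fetch_image() is a hypothetical placeholder:

    from oslo_concurrency import lockutils
    from oslo_vmware import exceptions as vexc

    def fetch_image(session, image_id):
        # Hypothetical placeholder: download the image into the cache dir.
        pass

    def ensure_cached_image(session, browser_ref, image_id,
                            ds_path="[datastore1] devstack-image-cache_base"):
        # Serialize concurrent spawns on the same cached image.
        with lockutils.lock(f"{ds_path}/{image_id}"):
            task = session.invoke_api(session.vim, "SearchDatastore_Task",
                                      browser_ref,
                                      datastorePath=f"{ds_path}/{image_id}")
            try:
                session.wait_for_task(task)   # found: image already cached
            except vexc.FileNotFoundException:
                fetch_image(session, image_id)  # miss: populate the cache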
[ 2552.842084] env[67977]: DEBUG oslo_concurrency.lockutils [req-b907425d-28c1-4668-a86e-fe7409a1374d req-cd07b8a9-239f-4dab-8c86-cad8313958e6 service nova] Acquiring lock "refresh_cache-e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2552.842233] env[67977]: DEBUG oslo_concurrency.lockutils [req-b907425d-28c1-4668-a86e-fe7409a1374d req-cd07b8a9-239f-4dab-8c86-cad8313958e6 service nova] Acquired lock "refresh_cache-e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2552.842394] env[67977]: DEBUG nova.network.neutron [req-b907425d-28c1-4668-a86e-fe7409a1374d req-cd07b8a9-239f-4dab-8c86-cad8313958e6 service nova] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Refreshing network info cache for port e3462720-c454-4032-b9c7-c962a09aeba9 {{(pid=67977) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 2553.073673] env[67977]: DEBUG nova.network.neutron [req-b907425d-28c1-4668-a86e-fe7409a1374d req-cd07b8a9-239f-4dab-8c86-cad8313958e6 service nova] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Updated VIF entry in instance network info cache for port e3462720-c454-4032-b9c7-c962a09aeba9. {{(pid=67977) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 2553.074104] env[67977]: DEBUG nova.network.neutron [req-b907425d-28c1-4668-a86e-fe7409a1374d req-cd07b8a9-239f-4dab-8c86-cad8313958e6 service nova] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Updating instance_info_cache with network_info: [{"id": "e3462720-c454-4032-b9c7-c962a09aeba9", "address": "fa:16:3e:36:d2:09", "network": {"id": "fc916a90-83ba-421b-a99d-da90e80b67a8", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-666819918-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "35a592e9754f4185ae803605651eda8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3462720-c4", "ovs_interfaceid": "e3462720-c454-4032-b9c7-c962a09aeba9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67977) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2553.083320] env[67977]: DEBUG oslo_concurrency.lockutils [req-b907425d-28c1-4668-a86e-fe7409a1374d req-cd07b8a9-239f-4dab-8c86-cad8313958e6 service nova] Releasing lock "refresh_cache-e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1" {{(pid=67977) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2553.775060] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
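The "Updated VIF entry" line shows the event-driven refresh: on network-changed, the cache entry for the affected port is rebuilt while the other VIFs are left alone. A plain-dict sketch of that replacement step, not Nova's _build_network_info_model:

    def update_vif_entry(nw_info, refreshed_vif):
        """Replace the cache entry whose id matches the refreshed port."""
        return [refreshed_vif if vif["id"] == refreshed_vif["id"] else vif
                for vif in nw_info]

    cache = [{"id": "e3462720-c454-4032-b9c7-c962a09aeba9", "active": False}]
    print(update_vif_entry(cache, {"id": "e3462720-c454-4032-b9c7-c962a09aeba9",
                                   "active": True}))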
[ 2553.775340] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2557.775296] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2557.779131] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67977) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2559.771217] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2559.774799] env[67977]: DEBUG oslo_service.periodic_task [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67977) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2559.774989] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Starting heal instance info cache {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2559.775144] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Rebuilding the list of instances to heal {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2559.792962] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 396fd258-dc89-4392-a487-921958012e92] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2559.793097] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 19f06190-ebc3-4089-81d4-1ac09ec46b46] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2559.793220] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: da2dc429-8c48-4aec-ba65-c2ff4a4b0fa8] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2559.793350] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: d4339fd1-c694-4349-bc12-4d915fa23079] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2559.793473] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: 53c9934a-21cb-4b10-9a3a-c0252138eb28] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2559.793595] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] [instance: e6ef2bc7-f50a-4c51-a547-ea97c38f1dd1] Skipping network cache update for instance because it is Building. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2559.793716] env[67977]: DEBUG nova.compute.manager [None req-c13a6cfc-a4b9-44e5-a00f-d167e388b908 None None] Didn't find any instances for network info cache update. {{(pid=67977) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
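The "Running periodic task ..." entries come from oslo.service's periodic-task framework; _heal_instance_info_cache is one such task, and the "Skipping ... Building." lines show it passing over instances that are still spawning. A minimal sketch of the framework's use, assuming oslo.service and oslo.config are installed; the spacing value and the two helpers are illustrative placeholders, not Nova's wiring:

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    def get_instances(context):
        # Hypothetical stand-in for the DB lookup of instances on this host.
        return []

    def refresh_cache(instance):
        # Hypothetical stand-in for the real network-info refresh.
        pass

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=10)
        def _heal_instance_info_cache(self, context):
            for instance in get_instances(context):
                if instance.vm_state == "building":
                    continue  # matches the "Skipping ... Building." lines
                refresh_cache(instance)

    # A service timer calls this repeatedly; each call runs whichever
    # registered tasks are due (producing the "Running periodic task" lines).
    mgr = Manager()
    mgr.run_periodic_tasks(context=None)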