[ 520.856466] env[69598]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code. [ 521.490692] env[69648]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code. [ 522.843171] env[69648]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=69648) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}} [ 522.843551] env[69648]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=69648) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}} [ 522.843700] env[69648]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=69648) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}} [ 522.843973] env[69648]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs [ 523.045033] env[69648]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=69648) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} [ 523.054973] env[69648]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=69648) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} [ 523.161929] env[69648]: INFO nova.virt.driver [None req-a9ff191e-d9e2-4ba7-a908-596857d14c88 None None] Loading compute driver 'vmwareapi.VMwareVCDriver' [ 523.234868] env[69648]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 523.235025] env[69648]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 523.235137] env[69648]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=69648) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}} [ 526.053373] env[69648]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-6b5d0419-cda6-43ce-bcaf-aaa25f6a295f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.070584] env[69648]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=69648) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}} [ 526.070763] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-4851af88-2e56-4c35-b9be-bcb0c0018592 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.095824] env[69648]: INFO oslo_vmware.api [-] Successfully established new session; session ID is a904c. 
[ 526.096016] env[69648]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.861s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 526.096530] env[69648]: INFO nova.virt.vmwareapi.driver [None req-a9ff191e-d9e2-4ba7-a908-596857d14c88 None None] VMware vCenter version: 7.0.3 [ 526.100093] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2c31e31-0acf-4e2c-9245-3331ce87596c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.122790] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf4cbf70-adb4-47bd-925c-89d9f94efd96 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.129269] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cbc6066-3c3f-41bd-ab47-350a2c9f3e25 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.136111] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2d4e503-df27-4ba2-be5e-b82e39687927 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.149267] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c968935-e713-48d3-9d56-d97dbcb06e79 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.155751] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9fb790e-8378-4a20-804e-30690ac4cbc1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.186584] env[69648]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-2f7b77bc-8434-473c-9a2f-69fac053b474 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.191901] env[69648]: DEBUG nova.virt.vmwareapi.driver [None req-a9ff191e-d9e2-4ba7-a908-596857d14c88 None None] Extension org.openstack.compute already exists. {{(pid=69648) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}} [ 526.194653] env[69648]: INFO nova.compute.provider_config [None req-a9ff191e-d9e2-4ba7-a908-596857d14c88 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access. 
[ 526.211708] env[69648]: DEBUG nova.context [None req-a9ff191e-d9e2-4ba7-a908-596857d14c88 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),94ed989e-7830-4dd5-9279-1149edb6ea17(cell1) {{(pid=69648) load_cells /opt/stack/nova/nova/context.py:464}} [ 526.213715] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 526.213935] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 526.214639] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 526.215065] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Acquiring lock "94ed989e-7830-4dd5-9279-1149edb6ea17" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 526.215263] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Lock "94ed989e-7830-4dd5-9279-1149edb6ea17" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 526.216273] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Lock "94ed989e-7830-4dd5-9279-1149edb6ea17" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 526.241046] env[69648]: INFO dbcounter [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Registered counter for database nova_cell0 [ 526.249390] env[69648]: INFO dbcounter [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Registered counter for database nova_cell1 [ 526.252271] env[69648]: DEBUG oslo_db.sqlalchemy.engines [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=69648) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}} [ 526.252625] env[69648]: DEBUG oslo_db.sqlalchemy.engines [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=69648) _check_effective_sql_mode 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}} [ 526.257105] env[69648]: DEBUG dbcounter [-] [69648] Writer thread running {{(pid=69648) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}} [ 526.257820] env[69648]: DEBUG dbcounter [-] [69648] Writer thread running {{(pid=69648) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}} [ 526.259883] env[69648]: ERROR nova.db.main.api [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 526.259883] env[69648]: result = function(*args, **kwargs) [ 526.259883] env[69648]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 526.259883] env[69648]: return func(*args, **kwargs) [ 526.259883] env[69648]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 526.259883] env[69648]: result = fn(*args, **kwargs) [ 526.259883] env[69648]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 526.259883] env[69648]: return f(*args, **kwargs) [ 526.259883] env[69648]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version [ 526.259883] env[69648]: return db.service_get_minimum_version(context, binaries) [ 526.259883] env[69648]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 526.259883] env[69648]: _check_db_access() [ 526.259883] env[69648]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 526.259883] env[69648]: stacktrace = ''.join(traceback.format_stack()) [ 526.259883] env[69648]: [ 526.260907] env[69648]: ERROR nova.db.main.api [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 526.260907] env[69648]: result = function(*args, **kwargs) [ 526.260907] env[69648]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 526.260907] env[69648]: return func(*args, **kwargs) [ 526.260907] env[69648]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 526.260907] env[69648]: result = fn(*args, **kwargs) [ 526.260907] env[69648]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 526.260907] env[69648]: return f(*args, **kwargs) [ 526.260907] env[69648]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version [ 526.260907] env[69648]: return db.service_get_minimum_version(context, binaries) [ 526.260907] env[69648]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 526.260907] env[69648]: _check_db_access() [ 526.260907] env[69648]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 526.260907] env[69648]: stacktrace = ''.join(traceback.format_stack()) [ 526.260907] env[69648]: [ 526.261342] env[69648]: WARNING nova.objects.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Failed to get minimum service version for cell 94ed989e-7830-4dd5-9279-1149edb6ea17 [ 526.261444] env[69648]: WARNING nova.objects.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000 [ 526.261858] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Acquiring lock "singleton_lock" {{(pid=69648) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 526.262031] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Acquired lock "singleton_lock" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 526.262279] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Releasing lock "singleton_lock" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 526.262600] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Full set of CONF: {{(pid=69648) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}} [ 526.262749] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ******************************************************************************** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}} [ 526.262879] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] Configuration options gathered from: {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}} [ 526.263023] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}} [ 526.263221] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}} [ 526.263349] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ================================================================================ {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}} [ 526.263613] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] allow_resize_to_same_host = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.263823] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] arq_binding_timeout = 300 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.263966] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] backdoor_port = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.264115] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] backdoor_socket = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.264288] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] block_device_allocate_retries = 60 {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.264459] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] block_device_allocate_retries_interval = 3 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.264635] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cert = self.pem {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.264808] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.264979] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute_monitors = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.265163] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] config_dir = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.265337] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] config_drive_format = iso9660 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.265499] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.265659] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] config_source = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.265835] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] console_host = devstack {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.266009] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] control_exchange = nova {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.266181] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cpu_allocation_ratio = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.266343] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] daemon = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.266516] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] debug = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.266672] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] default_access_ip_network_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.266843] 
env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] default_availability_zone = nova {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.267010] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] default_ephemeral_format = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.267177] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] default_green_pool_size = 1000 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.267422] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.267608] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] default_schedule_zone = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.267784] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] disk_allocation_ratio = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.267952] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] enable_new_services = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.268151] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] enabled_apis = ['osapi_compute'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.268318] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] enabled_ssl_apis = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.268485] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] flat_injected = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.268644] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] force_config_drive = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.268802] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] force_raw_images = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.268974] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 
None None] graceful_shutdown_timeout = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.269149] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] heal_instance_info_cache_interval = 60 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.269365] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] host = cpu-1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.269542] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.269711] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] initial_disk_allocation_ratio = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.269875] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] initial_ram_allocation_ratio = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.270115] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.270287] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] instance_build_timeout = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.270449] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] instance_delete_interval = 300 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.270618] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] instance_format = [instance: %(uuid)s] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.270790] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] instance_name_template = instance-%08x {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.270954] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] instance_usage_audit = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.271137] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] instance_usage_audit_period = month {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.271309] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.271481] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.271655] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] internal_service_availability_zone = internal {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.271812] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] key = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.271973] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] live_migration_retry_count = 30 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.272154] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] log_config_append = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.272325] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.272490] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] log_dir = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.272652] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] log_file = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.272783] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] log_options = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.272948] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] log_rotate_interval = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.273129] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] log_rotate_interval_type = days {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.273300] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] log_rotation_type = none {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.273467] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.273581] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.273760] env[69648]: DEBUG 
oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.273932] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.274074] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.274239] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] long_rpc_timeout = 1800 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.274401] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] max_concurrent_builds = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.274560] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] max_concurrent_live_migrations = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.274721] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] max_concurrent_snapshots = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.274878] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] max_local_block_devices = 3 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.275046] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] max_logfile_count = 30 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.275210] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] max_logfile_size_mb = 200 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.275369] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] maximum_instance_delete_attempts = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.275569] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] metadata_listen = 0.0.0.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.275752] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] metadata_listen_port = 8775 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.275925] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] metadata_workers = 2 {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.276101] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] migrate_max_retries = -1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.276274] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] mkisofs_cmd = genisoimage {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.276485] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] my_block_storage_ip = 10.180.1.21 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.276655] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] my_ip = 10.180.1.21 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.276832] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] network_allocate_retries = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.277028] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.277207] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] osapi_compute_listen = 0.0.0.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.277376] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] osapi_compute_listen_port = 8774 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.277547] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] osapi_compute_unique_server_name_scope = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.277719] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] osapi_compute_workers = 2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.277879] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] password_length = 12 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.278051] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] periodic_enable = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.278217] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] periodic_fuzzy_delay = 60 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.278388] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] pointer_model = usbtablet {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.278556] env[69648]: 
DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] preallocate_images = none {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.278720] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] publish_errors = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.278850] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] pybasedir = /opt/stack/nova {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.279014] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ram_allocation_ratio = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.279181] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] rate_limit_burst = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.279351] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] rate_limit_except_level = CRITICAL {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.279511] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] rate_limit_interval = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.279698] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] reboot_timeout = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.279864] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] reclaim_instance_interval = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.280032] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] record = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.280209] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] reimage_timeout_per_gb = 60 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.280378] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] report_interval = 120 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.280539] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] rescue_timeout = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.280702] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] reserved_host_cpus = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.280861] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] reserved_host_disk_mb = 0 {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.281030] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] reserved_host_memory_mb = 512 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.281197] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] reserved_huge_pages = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.281358] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] resize_confirm_window = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.281518] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] resize_fs_using_block_device = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.281680] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] resume_guests_state_on_host_boot = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.281853] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.282018] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] rpc_response_timeout = 60 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.282183] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] run_external_periodic_tasks = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.282375] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] running_deleted_instance_action = reap {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.282595] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] running_deleted_instance_poll_interval = 1800 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.282798] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] running_deleted_instance_timeout = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.282988] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] scheduler_instance_sync_interval = 120 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.283181] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] service_down_time = 720 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.283391] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] servicegroup_driver = db {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.283591] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] shelved_offload_time = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.283740] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] shelved_poll_interval = 3600 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.283964] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] shutdown_timeout = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.284222] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] source_is_ipv6 = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.284413] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ssl_only = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.284672] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.284847] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] sync_power_state_interval = 600 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.285362] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] sync_power_state_pool_size = 1000 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.285362] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] syslog_log_facility = LOG_USER {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.285362] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] tempdir = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.285563] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] timeout_nbd = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.285744] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] transport_url = **** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.285912] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] update_resources_interval = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.286086] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] use_cow_images = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.286251] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 
None None] use_eventlog = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.286411] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] use_journal = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.286646] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] use_json = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.286919] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] use_rootwrap_daemon = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.287131] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] use_stderr = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.287290] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] use_syslog = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.287451] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vcpu_pin_set = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.287623] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plugging_is_fatal = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.287795] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plugging_timeout = 300 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.287961] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] virt_mkfs = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.288136] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] volume_usage_poll_interval = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.288297] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] watch_log_file = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.288468] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] web = /usr/share/spice-html5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 526.288661] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_concurrency.disable_process_locking = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.288970] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.289180] 
env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.289368] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.289544] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.289720] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.289886] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.290082] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.auth_strategy = keystone {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.290255] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.compute_link_prefix = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.290435] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.290613] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.dhcp_domain = novalocal {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.290787] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.enable_instance_password = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.290954] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.glance_link_prefix = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.291136] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.291314] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.291479] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] 
api.instance_list_per_project_cells = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.291645] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.list_records_by_skipping_down_cells = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.291813] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.local_metadata_per_cell = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.291977] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.max_limit = 1000 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.292162] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.metadata_cache_expiration = 15 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.292339] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.neutron_default_tenant_id = default {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.292511] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.use_forwarded_for = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.292683] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.use_neutron_default_nets = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.292864] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.293045] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.293223] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.293400] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.293576] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.vendordata_dynamic_targets = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.293747] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api.vendordata_jsonfile_path = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.293932] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] 
api.vendordata_providers = ['StaticJSON'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.294141] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.backend = dogpile.cache.memcached {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.294312] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.backend_argument = **** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.294488] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.config_prefix = cache.oslo {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.294667] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.dead_timeout = 60.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.294838] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.debug_cache_backend = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.295011] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.enable_retry_client = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.295189] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.enable_socket_keepalive = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.295364] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.enabled = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.295570] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.expiration_time = 600 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.295745] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.hashclient_retry_attempts = 2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.295920] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.hashclient_retry_delay = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.296101] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.memcache_dead_retry = 300 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.296278] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.memcache_password = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.296445] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=69648) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.296640] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.296821] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.memcache_pool_maxsize = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.296986] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.297165] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.memcache_sasl_enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.297352] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.297526] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.memcache_socket_timeout = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.297702] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.memcache_username = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.297872] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.proxies = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.298090] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.retry_attempts = 2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.298277] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.retry_delay = 0.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.298447] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.socket_keepalive_count = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.298617] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.socket_keepalive_idle = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.298787] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.socket_keepalive_interval = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.298949] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.tls_allowed_ciphers = None {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.299126] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.tls_cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.299291] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.tls_certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.299456] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.tls_enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.299618] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cache.tls_keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.299796] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.auth_section = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.299973] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.auth_type = password {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.300151] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.300331] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.catalog_info = volumev3::publicURL {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.300497] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.300666] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.300833] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.cross_az_attach = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.300997] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.debug = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.301177] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.endpoint_template = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.301346] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.http_retries = 3 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.301514] env[69648]: DEBUG oslo_service.service [None 
req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.301679] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.301851] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.os_region_name = RegionOne {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.302028] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.302196] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cinder.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.302371] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.302539] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.cpu_dedicated_set = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.302707] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.cpu_shared_set = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.302873] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.image_type_exclude_list = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.303048] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.303218] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.max_concurrent_disk_ops = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.303409] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.max_disk_devices_to_attach = -1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.303566] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.303752] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.303923] env[69648]: DEBUG oslo_service.service 
[None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.resource_provider_association_refresh = 300 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.304101] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.shutdown_retry_interval = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.304288] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.304470] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] conductor.workers = 2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.304652] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] console.allowed_origins = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.304817] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] console.ssl_ciphers = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.304991] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] console.ssl_minimum_version = default {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.305180] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] consoleauth.token_ttl = 600 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.305357] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.305541] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.305718] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.305884] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.connect_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.306060] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.connect_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.306226] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.endpoint_override = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.306394] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] 
cyborg.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.306586] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.306730] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.max_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.306892] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.min_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.307065] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.region_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.307230] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.service_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.307404] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.service_type = accelerator {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.307603] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.307732] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.status_code_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.307894] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.status_code_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.308067] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.308252] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.308418] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] cyborg.version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.308607] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.backend = sqlalchemy {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.308796] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.connection = **** {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.309038] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.connection_debug = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.309250] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.connection_parameters = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.309442] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.connection_recycle_time = 3600 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.309653] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.connection_trace = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.309892] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.db_inc_retry_interval = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.310107] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.db_max_retries = 20 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.310343] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.db_max_retry_interval = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.310552] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.db_retry_interval = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.310799] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.max_overflow = 50 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.311053] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.max_pool_size = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.311270] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.max_retries = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.311466] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.311635] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.mysql_wsrep_sync_wait = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.311807] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.pool_timeout = None {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.311981] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.retry_interval = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.312160] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.slave_connection = **** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.312332] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.sqlite_synchronous = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.312500] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] database.use_db_reconnect = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.312686] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.backend = sqlalchemy {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.312867] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.connection = **** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.313052] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.connection_debug = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.313234] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.connection_parameters = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.313404] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.connection_recycle_time = 3600 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.313646] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.connection_trace = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.313903] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.db_inc_retry_interval = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.314105] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.db_max_retries = 20 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.314279] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.db_max_retry_interval = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.314450] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.db_retry_interval = 1 {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.314628] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.max_overflow = 50 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.314795] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.max_pool_size = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.314968] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.max_retries = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.315156] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.315322] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.315515] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.pool_timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.315688] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.retry_interval = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.315856] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.slave_connection = **** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.316040] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] api_database.sqlite_synchronous = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.316226] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] devices.enabled_mdev_types = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.316410] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.316607] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ephemeral_storage_encryption.enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.316791] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.316965] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.api_servers = None {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.317146] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.317314] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.317483] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.317648] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.connect_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.317816] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.connect_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.317980] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.debug = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.318163] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.default_trusted_certificate_ids = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.318331] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.enable_certificate_validation = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.318499] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.enable_rbd_download = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.318663] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.endpoint_override = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.318832] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.318997] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.319174] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.max_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.319339] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.min_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.319520] env[69648]: DEBUG 
oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.num_retries = 3 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.319718] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.rbd_ceph_conf = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.319887] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.rbd_connect_timeout = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.320073] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.rbd_pool = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.320250] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.rbd_user = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.320414] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.region_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.320581] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.service_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.320756] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.service_type = image {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.320921] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.321096] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.status_code_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.321261] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.status_code_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.321426] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.321611] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.321782] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.verify_glance_signatures = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.321945] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] glance.version = None {{(pid=69648) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.322128] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] guestfs.debug = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.322304] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.config_drive_cdrom = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.322471] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.config_drive_inject_password = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.322670] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.322845] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.enable_instance_metrics_collection = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.323024] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.enable_remotefx = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.323203] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.instances_path_share = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.323372] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.iscsi_initiator_list = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.323546] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.limit_cpu_features = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.323712] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.323878] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.324053] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.power_state_check_timeframe = 60 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.324233] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.324406] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=69648) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.324574] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.use_multipath_io = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.324741] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.volume_attach_retry_count = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.324906] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.325079] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.vswitch_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.325246] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.325416] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] mks.enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.325826] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.326030] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] image_cache.manager_interval = 2400 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.326211] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] image_cache.precache_concurrency = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.326387] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] image_cache.remove_unused_base_images = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.326584] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.326771] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.326955] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] image_cache.subdirectory_name = _base {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.327150] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.api_max_retries 
= 60 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.327319] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.api_retry_interval = 2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.327486] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.auth_section = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.327657] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.auth_type = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.327821] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.327985] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.328165] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.328334] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.conductor_group = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.328497] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.connect_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.328687] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.connect_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.328862] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.endpoint_override = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.329040] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.329208] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.329371] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.max_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.329537] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.min_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.329709] env[69648]: DEBUG 
oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.peer_list = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.329871] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.region_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.330048] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.serial_console_state_timeout = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.330217] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.service_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.330392] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.service_type = baremetal {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.330559] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.330722] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.status_code_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.330884] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.status_code_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.331054] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.331241] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.331404] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ironic.version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.331591] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.331798] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] key_manager.fixed_key = **** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.331991] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.332175] env[69648]: DEBUG oslo_service.service [None 
req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.barbican_api_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.332343] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.barbican_endpoint = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.332520] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.barbican_endpoint_type = public {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.332684] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.barbican_region_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.332846] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.333013] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.333190] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.333355] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.333545] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.333713] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.number_of_retries = 60 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.333879] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.retry_delay = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.334057] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.send_service_user_token = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.334254] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.334389] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.334555] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.verify_ssl = True {{(pid=69648) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.334741] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican.verify_ssl_path = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.334920] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican_service_user.auth_section = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.335101] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican_service_user.auth_type = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.335268] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican_service_user.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.335431] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican_service_user.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.335625] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican_service_user.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.335797] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican_service_user.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.335960] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican_service_user.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.336142] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican_service_user.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.336305] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] barbican_service_user.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.336485] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.approle_role_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.336669] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.approle_secret_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.336832] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.336993] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.certfile = None {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.337174] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.337340] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.337501] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.337695] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.kv_mountpoint = secret {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.337876] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.kv_path = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.338057] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.kv_version = 2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.338225] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.namespace = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.338388] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.root_token_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.338554] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.338716] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.ssl_ca_crt_file = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.338875] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.339050] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.use_ssl = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.339226] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.339402] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.auth_section = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.339570] env[69648]: DEBUG oslo_service.service [None 
req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.auth_type = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.339733] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.339895] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.340072] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.340235] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.connect_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.340398] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.connect_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.340559] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.endpoint_override = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.340742] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.340917] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.341099] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.max_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.341266] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.min_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.341428] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.region_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.341587] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.service_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.341762] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.service_type = identity {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.341928] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.split_loggers = False {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.342099] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.status_code_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.342264] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.status_code_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.342422] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.342605] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.342770] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] keystone.version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.342972] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.connection_uri = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.343150] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.cpu_mode = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.343322] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.cpu_model_extra_flags = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.343492] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.cpu_models = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.343667] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.cpu_power_governor_high = performance {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.343864] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.cpu_power_governor_low = powersave {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.344047] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.cpu_power_management = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.344227] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.344394] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.device_detach_attempts = 8 {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.344560] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.device_detach_timeout = 20 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.344731] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.disk_cachemodes = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.344895] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.disk_prefix = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.345073] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.enabled_perf_events = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.345244] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.file_backed_memory = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.345411] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.gid_maps = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.345596] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.hw_disk_discard = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.345768] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.hw_machine_type = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.345940] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.images_rbd_ceph_conf = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.346118] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.346291] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.346469] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.images_rbd_glance_store_name = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.346673] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.images_rbd_pool = rbd {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.346850] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.images_type = default {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.347023] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.images_volume_group = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.347192] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.inject_key = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.347357] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.inject_partition = -2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.347520] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.inject_password = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.347686] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.iscsi_iface = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.347849] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.iser_use_multipath = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.348026] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_bandwidth = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.348186] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.348351] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_downtime = 500 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.348515] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.348685] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.348849] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_inbound_addr = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.349025] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.349195] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_permit_post_copy = False {{(pid=69648) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.349360] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_scheme = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.349553] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_timeout_action = abort {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.349744] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_tunnelled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.349910] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_uri = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.350088] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.live_migration_with_native_tls = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.350257] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.max_queues = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.350423] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.350586] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.nfs_mount_options = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.350905] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.351095] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.351270] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.num_iser_scan_tries = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.351438] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.num_memory_encrypted_guests = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.351606] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.351774] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.num_pcie_ports = 0 
{{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.351945] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.num_volume_scan_tries = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.352128] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.pmem_namespaces = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.352293] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.quobyte_client_cfg = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.352601] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.352792] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.rbd_connect_timeout = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.352965] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.353156] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.353326] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.rbd_secret_uuid = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.353489] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.rbd_user = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.353657] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.353834] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.remote_filesystem_transport = ssh {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.353997] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.rescue_image_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.354175] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.rescue_kernel_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.354341] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.rescue_ramdisk_id = None {{(pid=69648) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.354515] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.354681] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.rx_queue_size = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.354853] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.smbfs_mount_options = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.355147] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.355325] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.snapshot_compression = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.355521] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.snapshot_image_format = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.355766] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.355946] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.sparse_logical_volumes = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.356130] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.swtpm_enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.356312] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.swtpm_group = tss {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.356488] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.swtpm_user = tss {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.356666] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.sysinfo_serial = unique {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.356827] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.tb_cache_size = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.356991] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.tx_queue_size = None {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.357177] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.uid_maps = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.357347] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.use_virtio_for_bridges = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.357522] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.virt_type = kvm {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.357699] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.volume_clear = zero {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.357870] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.volume_clear_size = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.358056] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.volume_use_multipath = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.358225] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.vzstorage_cache_path = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.358401] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.358574] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.vzstorage_mount_group = qemu {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.358773] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.vzstorage_mount_opts = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.358952] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.359248] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.359435] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.vzstorage_mount_user = stack {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.359606] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=69648) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.359784] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.auth_section = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.359963] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.auth_type = password {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.360138] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.360305] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.360473] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.360639] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.connect_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.360801] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.connect_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.360977] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.default_floating_pool = public {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.361158] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.endpoint_override = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.361326] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.extension_sync_interval = 600 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.361494] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.http_retries = 3 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.361679] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.361857] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.362030] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.max_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.362210] env[69648]: 
DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.362375] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.min_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.362549] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.ovs_bridge = br-int {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.362722] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.physnets = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.362896] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.region_name = RegionOne {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.363080] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.service_metadata_proxy = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.363247] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.service_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.363418] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.service_type = network {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.363586] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.363753] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.status_code_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.363916] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.status_code_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.364089] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.364277] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.364441] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] neutron.version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.364616] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None 
None] notifications.bdms_in_notifications = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.364861] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] notifications.default_level = INFO {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.365066] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] notifications.notification_format = unversioned {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.365245] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] notifications.notify_on_state_change = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.365428] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.365639] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] pci.alias = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.365823] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] pci.device_spec = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.365996] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] pci.report_in_placement = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.366186] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.auth_section = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.366366] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.auth_type = password {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.366539] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.366704] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.366864] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.367040] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.367207] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] 
placement.connect_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.367370] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.connect_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.367533] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.default_domain_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.367714] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.default_domain_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.367898] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.domain_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.368076] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.domain_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.368242] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.endpoint_override = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.368409] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.368575] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.368737] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.max_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.368896] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.min_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.369077] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.password = **** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.369243] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.project_domain_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.369411] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.project_domain_name = Default {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.369580] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.project_id = None {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.369757] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.project_name = service {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.369928] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.region_name = RegionOne {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.370106] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.service_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.370280] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.service_type = placement {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.370448] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.370610] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.status_code_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.370792] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.status_code_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.370969] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.system_scope = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.371146] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.371313] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.trust_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.371475] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.user_domain_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.371652] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.user_domain_name = Default {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.371815] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.user_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.371990] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.username = placement {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
526.372189] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.372352] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] placement.version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.372532] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.cores = 20 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.372702] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.count_usage_from_placement = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.372876] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.373060] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.injected_file_content_bytes = 10240 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.373235] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.injected_file_path_length = 255 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.373403] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.injected_files = 5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.373573] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.instances = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.373757] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.key_pairs = 100 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.373981] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.metadata_items = 128 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.374178] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.ram = 51200 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.374351] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.recheck_quota = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.374527] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] quota.server_group_members = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.374702] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None 
None] quota.server_groups = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.374873] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] rdp.enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.375205] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.375393] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.375592] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.375766] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] scheduler.image_metadata_prefilter = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.375937] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.376119] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] scheduler.max_attempts = 3 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.376290] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] scheduler.max_placement_results = 1000 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.376465] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.376657] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] scheduler.query_placement_for_image_type_support = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.376827] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.377024] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] scheduler.workers = 2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.377201] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
526.377377] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.377561] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.377741] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.377915] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.378113] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.378287] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.378482] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.378657] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.host_subset_size = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.378828] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.378995] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.379178] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.379350] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.isolated_hosts = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.379531] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.isolated_images = [] 
{{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.379719] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.379887] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.380070] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.380240] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.pci_in_placement = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.380415] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.380580] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.380753] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.380916] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.381095] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.381265] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.381430] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.track_instance_changes = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.381609] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.381784] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] metrics.required = True {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.381952] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] metrics.weight_multiplier = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.382133] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.382304] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] metrics.weight_setting = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.382628] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.382819] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] serial_console.enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.383014] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] serial_console.port_range = 10000:20000 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.383197] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.383372] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.383545] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] serial_console.serialproxy_port = 6083 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.383721] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] service_user.auth_section = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.383902] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] service_user.auth_type = password {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.384088] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] service_user.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.384256] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] service_user.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.384423] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] service_user.collect_timing = False {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.384592] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] service_user.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.384759] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] service_user.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.384955] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] service_user.send_service_user_token = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.385128] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] service_user.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.385293] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] service_user.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.385473] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.agent_enabled = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.385682] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.385996] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.386212] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.386388] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.html5proxy_port = 6082 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.386558] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.image_compression = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.386722] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.jpeg_compression = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.386884] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.playback_compression = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.387067] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.server_listen = 127.0.0.1 {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.387243] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.387407] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.streaming_mode = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.387570] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] spice.zlib_compression = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.387739] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] upgrade_levels.baseapi = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.387900] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] upgrade_levels.cert = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.388121] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] upgrade_levels.compute = auto {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.388352] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] upgrade_levels.conductor = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.388529] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] upgrade_levels.scheduler = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.388744] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vendordata_dynamic_auth.auth_section = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.388923] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vendordata_dynamic_auth.auth_type = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.389101] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vendordata_dynamic_auth.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.389275] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vendordata_dynamic_auth.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.389452] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.389615] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vendordata_dynamic_auth.insecure = False {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.389779] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vendordata_dynamic_auth.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.389942] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.390114] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vendordata_dynamic_auth.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.390293] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.api_retry_count = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.390456] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.ca_file = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.390629] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.cache_prefix = devstack-image-cache {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.390800] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.cluster_name = testcl1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.390968] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.connection_pool_size = 10 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.391142] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.console_delay_seconds = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.391313] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.datastore_regex = ^datastore.* {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.391525] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.391751] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.host_password = **** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.391965] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.host_port = 443 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.392165] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.host_username = administrator@vsphere.local {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.392343] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.insecure = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.392507] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.integration_bridge = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.392675] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.maximum_objects = 100 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.392837] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.pbm_default_policy = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.393008] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.pbm_enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.393178] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.pbm_wsdl_location = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.393350] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.393541] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.serial_port_proxy_uri = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.393710] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.serial_port_service_uri = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.393880] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.task_poll_interval = 0.5 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.394067] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.use_linked_clone = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.394243] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.vnc_keymap = en-us {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.394412] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.vnc_port = 5900 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.394579] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vmware.vnc_port_total = 10000 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.394782] 
env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vnc.auth_schemes = ['none'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.394961] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vnc.enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.395277] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.395501] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.395666] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vnc.novncproxy_port = 6080 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.395849] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vnc.server_listen = 127.0.0.1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.396036] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.396203] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vnc.vencrypt_ca_certs = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.396364] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vnc.vencrypt_client_cert = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.396525] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vnc.vencrypt_client_key = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.396703] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.396869] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.397049] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.397217] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
526.397382] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.disable_rootwrap = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.397546] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.enable_numa_live_migration = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.397709] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.397876] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.398045] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.398215] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.libvirt_disable_apic = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.398378] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.398543] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.398707] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.398871] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.399042] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.399210] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.399372] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.399534] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 
None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.399695] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.399864] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.400061] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.400237] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] wsgi.client_socket_timeout = 900 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.400409] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] wsgi.default_pool_size = 1000 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.400578] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] wsgi.keep_alive = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.400748] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] wsgi.max_header_line = 16384 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.400913] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] wsgi.secure_proxy_ssl_header = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.401089] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] wsgi.ssl_ca_file = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.401256] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] wsgi.ssl_cert_file = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.401421] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] wsgi.ssl_key_file = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.401586] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] wsgi.tcp_keepidle = 600 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.401764] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.401929] env[69648]: DEBUG 
oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] zvm.ca_file = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.402131] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] zvm.cloud_connector_url = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.402448] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.402624] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] zvm.reachable_timeout = 300 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.402810] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_policy.enforce_new_defaults = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.402985] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_policy.enforce_scope = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.403204] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_policy.policy_default_rule = default {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.403408] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.403605] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_policy.policy_file = policy.yaml {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.403786] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.403953] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.404131] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.404294] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.404461] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.404632] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.404809] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.404987] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler.connection_string = messaging:// {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.405200] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler.enabled = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.405345] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler.es_doc_type = notification {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.405532] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler.es_scroll_size = 10000 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.405716] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler.es_scroll_time = 2m {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.405886] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler.filter_error_trace = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.406076] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler.hmac_keys = **** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.406252] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler.sentinel_service_name = mymaster {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.406422] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler.socket_timeout = 0.1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.406587] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler.trace_requests = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.406751] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler.trace_sqlalchemy = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.406934] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler_jaeger.process_tags = {} {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.407109] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler_jaeger.service_name_prefix = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.407279] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] profiler_otlp.service_name_prefix = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.407447] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] remote_debug.host = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.407609] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] remote_debug.port = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.407797] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.407963] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.408143] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.408312] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.408478] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.408644] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.408808] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.408972] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.409149] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.409313] env[69648]: DEBUG oslo_service.service [None 
req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.409486] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.409656] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.409827] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.409997] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.410175] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.410352] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.410521] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.410690] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.410859] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.411036] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.411204] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.411373] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.411539] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] 
oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.411706] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.411877] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.412060] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.ssl = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.412241] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.412413] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.412579] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.412755] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.412927] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_rabbit.ssl_version = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.413128] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.413301] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_notifications.retry = -1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.413508] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.413694] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_messaging_notifications.transport_url = **** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.413871] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.auth_section = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.414050] env[69648]: 
DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.auth_type = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.414216] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.cafile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.414378] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.certfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.414545] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.collect_timing = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.414709] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.connect_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.414870] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.connect_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.415042] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.endpoint_id = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.415206] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.endpoint_override = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.415371] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.insecure = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.415552] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.keyfile = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.415725] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.max_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.415886] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.min_version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.416058] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.region_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.416220] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.service_name = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.416380] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] 
oslo_limit.service_type = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.416566] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.split_loggers = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.416733] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.status_code_retries = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.416894] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.status_code_retry_delay = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.417064] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.timeout = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.417227] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.valid_interfaces = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.417385] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_limit.version = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.417551] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_reports.file_event_handler = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.417753] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.417916] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] oslo_reports.log_dir = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.418151] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.418334] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.418504] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.418700] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.418873] env[69648]: DEBUG oslo_service.service [None 
req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.419047] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.419223] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.419384] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_ovs_privileged.group = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.419544] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.419713] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.419879] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.420047] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] vif_plug_ovs_privileged.user = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.420224] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_linux_bridge.flat_interface = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.420405] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.420580] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.420754] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.420927] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.421110] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.421279] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.421444] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.421628] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.421802] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_ovs.isolate_vif = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.421970] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.422150] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.422323] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.422492] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_ovs.ovsdb_interface = native {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.422656] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_vif_ovs.per_port_bridge = False {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.422822] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_brick.lock_path = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.422987] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.423164] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.423337] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] privsep_osbrick.capabilities = [21] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.423526] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] privsep_osbrick.group = None {{(pid=69648) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.423695] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] privsep_osbrick.helper_command = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.423867] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.424044] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.424213] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] privsep_osbrick.user = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.424383] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.424543] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] nova_sys_admin.group = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.424704] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] nova_sys_admin.helper_command = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.424870] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.425044] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.425208] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] nova_sys_admin.user = None {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.425339] env[69648]: DEBUG oslo_service.service [None req-ff909bbe-f908-43d5-8e78-532630064515 None None] ******************************************************************************** {{(pid=69648) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 526.425804] env[69648]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 526.436667] env[69648]: WARNING nova.virt.vmwareapi.driver [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. If you are using the driver in production please let us know via the openstack-discuss mailing list. 
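
The long block of "group.option = value" lines that closes with the row of asterisks above is oslo.config's option dump, emitted at DEBUG when the service launches (the logger name oslo_service.service and the log_opt_values frame in cfg.py point at that mechanism). A minimal sketch of how such a dump is produced, assuming a hypothetical example_privileged group and option purely for illustration; only ConfigOpts.log_opt_values() is the real call referenced by the log:

    import logging

    from oslo_config import cfg

    CONF = cfg.CONF
    LOG = logging.getLogger(__name__)

    # Hypothetical group/option, registered the same way libraries such as
    # os_vif and oslo.privsep register the groups visible in the dump above.
    example_group = cfg.OptGroup(name='example_privileged')
    CONF.register_group(example_group)
    CONF.register_opts([cfg.IntOpt('thread_pool_size', default=8)],
                       group=example_group)

    def dump_effective_config(argv=None):
        logging.basicConfig(level=logging.DEBUG)
        CONF(argv or [], project='example')
        # Emits one "group.option = value" DEBUG line per registered option,
        # followed by a closing row of asterisks, as seen in the log.
        CONF.log_opt_values(LOG, logging.DEBUG)

    if __name__ == '__main__':
        dump_effective_config()

Because every loaded library registers its own groups on the shared CONF object, the dump covers oslo_limit, the vif_plug_* and privsep groups, and nova_sys_admin in one pass before the compute node starts its resource audit.
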
[ 526.437133] env[69648]: INFO nova.virt.node [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Generated node identity d38a352b-7808-44da-8216-792e96aadc88 [ 526.437386] env[69648]: INFO nova.virt.node [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Wrote node identity d38a352b-7808-44da-8216-792e96aadc88 to /opt/stack/data/n-cpu-1/compute_id [ 526.449580] env[69648]: WARNING nova.compute.manager [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Compute nodes ['d38a352b-7808-44da-8216-792e96aadc88'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 526.482694] env[69648]: INFO nova.compute.manager [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 526.505636] env[69648]: WARNING nova.compute.manager [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 526.505891] env[69648]: DEBUG oslo_concurrency.lockutils [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 526.506144] env[69648]: DEBUG oslo_concurrency.lockutils [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 526.506315] env[69648]: DEBUG oslo_concurrency.lockutils [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 526.506475] env[69648]: DEBUG nova.compute.resource_tracker [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 526.507642] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86a75532-d255-4b96-a55c-c6bf58f2c658 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.516214] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99d50a21-4c2d-4c17-9eac-752b87f3b004 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.529637] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0033e10a-eb8b-46dd-9787-96cb1dbd493f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.535692] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-30e5be0e-1758-4ac3-954f-058371c22338 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.565256] env[69648]: DEBUG nova.compute.resource_tracker [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180995MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 526.565366] env[69648]: DEBUG oslo_concurrency.lockutils [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 526.565578] env[69648]: DEBUG oslo_concurrency.lockutils [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 526.577459] env[69648]: WARNING nova.compute.resource_tracker [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] No compute node record for cpu-1:d38a352b-7808-44da-8216-792e96aadc88: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host d38a352b-7808-44da-8216-792e96aadc88 could not be found. [ 526.589390] env[69648]: INFO nova.compute.resource_tracker [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: d38a352b-7808-44da-8216-792e96aadc88 [ 526.639602] env[69648]: DEBUG nova.compute.resource_tracker [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 526.639788] env[69648]: DEBUG nova.compute.resource_tracker [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 526.750062] env[69648]: INFO nova.scheduler.client.report [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] [req-3d3f2957-0571-46bc-beae-0e10f4c67951] Created resource provider record via placement API for resource provider with UUID d38a352b-7808-44da-8216-792e96aadc88 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
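
The hypervisor view audited above is what the resource tracker turns into the VCPU/MEMORY_MB/DISK_GB inventory reported to Placement in the next few entries. A simplified sketch, not nova's ResourceTracker code, of assembling that payload; every number is copied from the inventory logged below (VCPU total 48 with a 4.0 allocation ratio and max_unit 16, MEMORY_MB total 196590 with 512 reserved and max_unit 65530, DISK_GB total 400 with max_unit 94):

    def build_placement_inventory(total_vcpus, total_ram_mb, total_disk_gb,
                                  reserved_ram_mb=512,
                                  cpu_allocation_ratio=4.0,
                                  max_unit_vcpu=16,
                                  max_unit_ram_mb=65530,
                                  max_unit_disk_gb=94):
        """Return a placement-style inventory dict keyed by resource class."""
        return {
            'VCPU': {'total': total_vcpus, 'reserved': 0, 'min_unit': 1,
                     'max_unit': max_unit_vcpu, 'step_size': 1,
                     'allocation_ratio': cpu_allocation_ratio},
            'MEMORY_MB': {'total': total_ram_mb, 'reserved': reserved_ram_mb,
                          'min_unit': 1, 'max_unit': max_unit_ram_mb,
                          'step_size': 1, 'allocation_ratio': 1.0},
            'DISK_GB': {'total': total_disk_gb, 'reserved': 0, 'min_unit': 1,
                        'max_unit': max_unit_disk_gb, 'step_size': 1,
                        'allocation_ratio': 1.0},
        }

    # Mirrors the inventory set below for provider
    # d38a352b-7808-44da-8216-792e96aadc88.
    inventory = build_placement_inventory(48, 196590, 400)

After the inventory is pushed, the provider generation advances from 0 to 1 for update_inventory and from 1 to 2 for update_traits, the usual pattern of bumping the Placement generation on each successful provider change.
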
[ 526.766586] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c407ad6-fcc6-41f2-bd20-fb3791f27f73 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.773709] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-593ce53f-fc09-4d45-ad02-ba96d154c931 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.802370] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb82379a-7c50-4a61-9fbb-8d5e044ebbd1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.809110] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3565acc9-0179-444b-a869-ea72b2c4a38a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.821695] env[69648]: DEBUG nova.compute.provider_tree [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Updating inventory in ProviderTree for provider d38a352b-7808-44da-8216-792e96aadc88 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 526.860306] env[69648]: DEBUG nova.scheduler.client.report [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Updated inventory for provider d38a352b-7808-44da-8216-792e96aadc88 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 526.860549] env[69648]: DEBUG nova.compute.provider_tree [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Updating resource provider d38a352b-7808-44da-8216-792e96aadc88 generation from 0 to 1 during operation: update_inventory {{(pid=69648) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 526.860723] env[69648]: DEBUG nova.compute.provider_tree [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Updating inventory in ProviderTree for provider d38a352b-7808-44da-8216-792e96aadc88 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 526.909654] env[69648]: DEBUG nova.compute.provider_tree [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Updating resource 
provider d38a352b-7808-44da-8216-792e96aadc88 generation from 1 to 2 during operation: update_traits {{(pid=69648) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 526.926494] env[69648]: DEBUG nova.compute.resource_tracker [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 526.926713] env[69648]: DEBUG oslo_concurrency.lockutils [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.361s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 526.926896] env[69648]: DEBUG nova.service [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Creating RPC server for service compute {{(pid=69648) start /opt/stack/nova/nova/service.py:182}} [ 526.941069] env[69648]: DEBUG nova.service [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] Join ServiceGroup membership for this service compute {{(pid=69648) start /opt/stack/nova/nova/service.py:199}} [ 526.941364] env[69648]: DEBUG nova.servicegroup.drivers.db [None req-90b91558-f534-4bc3-88ab-bf928d05ec60 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=69648) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 536.259123] env[69648]: DEBUG dbcounter [-] [69648] Writing DB stats nova_cell1:SELECT=1 {{(pid=69648) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 536.259810] env[69648]: DEBUG dbcounter [-] [69648] Writing DB stats nova_cell0:SELECT=1 {{(pid=69648) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 571.020536] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Acquiring lock "de093ae4-0e4c-49e8-9beb-c61501c5c705" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 571.021231] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Lock "de093ae4-0e4c-49e8-9beb-c61501c5c705" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 571.042026] env[69648]: DEBUG nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 571.175351] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 571.175761] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 571.180712] env[69648]: INFO nova.compute.claims [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 571.369309] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50adba41-ea8b-4dc2-8b9e-7a120dcdca1f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.379136] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5cefb44-359b-4b5c-bb80-ed1e88db3169 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.422097] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f967b0d-d57c-4648-83b8-06d3f6155185 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.432445] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51f32dab-cb56-4bac-8da9-9ff4323f8e5c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.447448] env[69648]: DEBUG nova.compute.provider_tree [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 571.468621] env[69648]: DEBUG nova.scheduler.client.report [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 571.514203] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 
tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.338s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 571.514879] env[69648]: DEBUG nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 571.582579] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Acquiring lock "549c349e-5417-408c-acb2-93e506476e2a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 571.582579] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Lock "549c349e-5417-408c-acb2-93e506476e2a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 571.599089] env[69648]: DEBUG nova.compute.utils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 571.602075] env[69648]: DEBUG nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Not allocating networking since 'none' was specified. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 571.611401] env[69648]: DEBUG nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 571.623173] env[69648]: DEBUG nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Start building block device mappings for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 571.725871] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 571.725871] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 571.731282] env[69648]: INFO nova.compute.claims [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 571.773836] env[69648]: DEBUG nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 571.907383] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4f3afbd-7194-4264-a40d-175949f508fe {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.915641] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15458b11-7733-494d-acee-f38584ec711c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.954786] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e688ab58-f2f3-45bb-aea2-3c840158ae39 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.962956] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-816b8591-9e9e-42ca-ad09-6093855cc45d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.981463] env[69648]: DEBUG nova.compute.provider_tree [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 571.998084] env[69648]: DEBUG nova.scheduler.client.report [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 
1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 572.017817] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.294s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 572.020220] env[69648]: DEBUG nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 572.085676] env[69648]: DEBUG nova.compute.utils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 572.087027] env[69648]: DEBUG nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 572.088337] env[69648]: DEBUG nova.network.neutron [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 572.124673] env[69648]: DEBUG nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 572.255760] env[69648]: DEBUG nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 572.305254] env[69648]: DEBUG nova.virt.hardware [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 572.305254] env[69648]: DEBUG nova.virt.hardware [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 572.305254] env[69648]: DEBUG nova.virt.hardware [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 572.305404] env[69648]: DEBUG nova.virt.hardware [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 572.305404] env[69648]: DEBUG nova.virt.hardware [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 572.305404] env[69648]: DEBUG nova.virt.hardware [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 572.305404] env[69648]: DEBUG nova.virt.hardware [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 572.305404] env[69648]: DEBUG nova.virt.hardware [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 572.305626] env[69648]: DEBUG nova.virt.hardware [None 
req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 572.308489] env[69648]: DEBUG nova.virt.hardware [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 572.308489] env[69648]: DEBUG nova.virt.hardware [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 572.308489] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf2a2b31-59d6-4dba-80e0-6b15d3613aa9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.317939] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5993078-61ff-41ab-8dfb-4cb319163d5c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.341316] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f08c0887-6899-43eb-a898-a8f625bc5dea {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.388154] env[69648]: DEBUG nova.virt.hardware [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 572.388154] env[69648]: DEBUG nova.virt.hardware [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 572.388154] env[69648]: DEBUG nova.virt.hardware [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 572.388916] env[69648]: DEBUG nova.virt.hardware [None req-36b60c48-3569-4438-be38-dd6a7af01c51 
tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 572.389108] env[69648]: DEBUG nova.virt.hardware [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 572.389294] env[69648]: DEBUG nova.virt.hardware [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 572.390465] env[69648]: DEBUG nova.virt.hardware [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 572.390708] env[69648]: DEBUG nova.virt.hardware [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 572.390919] env[69648]: DEBUG nova.virt.hardware [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 572.391065] env[69648]: DEBUG nova.virt.hardware [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 572.391260] env[69648]: DEBUG nova.virt.hardware [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 572.393709] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53a665cc-b5de-499e-8ae5-cf0d11da3275 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.408086] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d3ab121-67f5-4cac-a719-79fceb2958f0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.431596] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Instance VIF info [] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 572.445210] env[69648]: DEBUG 
nova.virt.vmwareapi.vm_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 572.445210] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1d869771-2930-4c9a-b3b7-1a21fc6446fc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.455995] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Created folder: OpenStack in parent group-v4. [ 572.456302] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Creating folder: Project (92b90d0f568f4e32a69a986a3cb0d3af). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 572.456561] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dcd1cde1-a646-4287-9618-ebeb1857ab1e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.469773] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Created folder: Project (92b90d0f568f4e32a69a986a3cb0d3af) in parent group-v692308. [ 572.469881] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Creating folder: Instances. Parent ref: group-v692309. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 572.470155] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6c0a123c-eb77-45b3-8dbc-665f0dec62a4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.479015] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Created folder: Instances in parent group-v692309. [ 572.479015] env[69648]: DEBUG oslo.service.loopingcall [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 572.479196] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 572.479424] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cf6f6919-11e8-4958-b4a0-842af05b1c02 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.500306] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 572.500306] env[69648]: value = "task-3466455" [ 572.500306] env[69648]: _type = "Task" [ 572.500306] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 572.509163] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466455, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 572.596240] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquiring lock "d2e78734-c619-43ab-bdad-bc18cc78c5e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 572.596523] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Lock "d2e78734-c619-43ab-bdad-bc18cc78c5e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 572.624818] env[69648]: DEBUG nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 572.665565] env[69648]: DEBUG nova.policy [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48d6d8fd75b24bd3889bb7024fa20971', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e91c0a40b694d37b39c865a0d835032', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 572.730765] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 572.730765] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 572.732074] env[69648]: INFO nova.compute.claims [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 572.813444] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Acquiring lock "54630c78-200e-4b36-8612-34f411e08821" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 572.813716] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Lock "54630c78-200e-4b36-8612-34f411e08821" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 572.832931] env[69648]: DEBUG nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 572.920479] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 572.926499] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-116adc37-8e89-44bd-866d-d4df2da534c8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.938490] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e7d3602-730c-4a12-a42b-21dd913a0da4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.978060] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7d905ee-6d7b-4a4f-8168-871856d12b24 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.985752] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5718d8e-74b8-4a2e-ab19-3b29b541649b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.000876] env[69648]: DEBUG nova.compute.provider_tree [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 573.010278] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466455, 'name': CreateVM_Task, 'duration_secs': 0.431621} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 573.012851] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 573.013643] env[69648]: DEBUG nova.scheduler.client.report [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 573.022030] env[69648]: DEBUG oslo_vmware.service [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f1748b5-5693-4f76-9c01-4d821baf4605 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.024885] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 573.025060] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 573.025742] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 573.026315] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-70f1a2d3-cae0-4a02-899e-880f657afe02 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.031540] env[69648]: DEBUG oslo_vmware.api [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Waiting for the task: (returnval){ [ 573.031540] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]521f965e-e25d-213e-9d74-941091734daa" [ 573.031540] env[69648]: _type = "Task" [ 573.031540] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 573.036188] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 573.036780] env[69648]: DEBUG nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 573.041024] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.119s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 573.041317] env[69648]: INFO nova.compute.claims [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 573.049342] env[69648]: DEBUG oslo_vmware.api [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]521f965e-e25d-213e-9d74-941091734daa, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 573.079969] env[69648]: DEBUG nova.compute.utils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 573.084322] env[69648]: DEBUG nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Not allocating networking since 'none' was specified. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 573.096090] env[69648]: DEBUG nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 573.193793] env[69648]: DEBUG nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 573.225107] env[69648]: DEBUG nova.virt.hardware [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 573.225107] env[69648]: DEBUG nova.virt.hardware [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 573.225107] env[69648]: DEBUG nova.virt.hardware [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 573.225254] env[69648]: DEBUG nova.virt.hardware [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 573.225254] env[69648]: DEBUG nova.virt.hardware [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 573.225785] env[69648]: DEBUG nova.virt.hardware [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 573.226088] env[69648]: DEBUG nova.virt.hardware [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 573.226284] env[69648]: DEBUG nova.virt.hardware [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
573.226465] env[69648]: DEBUG nova.virt.hardware [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 573.226629] env[69648]: DEBUG nova.virt.hardware [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 573.226794] env[69648]: DEBUG nova.virt.hardware [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 573.227694] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-513694bc-edbc-4d8a-b1bf-4602a6a48c94 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.245820] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fb2d0b7-b05b-473d-8420-9d3310bab6fb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.265276] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Instance VIF info [] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 573.271827] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Creating folder: Project (22ef64df507349448b07112fde90bbd3). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 573.272535] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-966d9a00-8a22-4e0d-bb79-6dc0f571519c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.276472] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c74fd3c-28c9-4c93-8d20-a970b1f3bb93 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.293124] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91bb59f0-c6a9-4dbf-9d10-1b870ce72c1d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.296742] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Created folder: Project (22ef64df507349448b07112fde90bbd3) in parent group-v692308. 
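The CPU topology lines above ("Build topologies for 1 vcpu(s) 1:1:1", "Got 1 possible topologies", "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]") record nova.virt.hardware enumerating every (sockets, cores, threads) combination whose product equals the flavor's vCPU count and that fits under the 65536 per-dimension limits also shown in the log. The sketch below reproduces that enumeration in isolation; the function name, signature, and simplified limit handling are assumptions for illustration, not Nova's actual implementation.

    # Illustrative sketch only, not Nova's real nova.virt.hardware code:
    # list every (sockets, cores, threads) triple whose product equals the
    # vCPU count and that stays within the per-dimension limits seen in the
    # log (65536 each).
    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        topologies = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % sockets:
                continue
            rest = vcpus // sockets
            for cores in range(1, min(rest, max_cores) + 1):
                if rest % cores:
                    continue
                threads = rest // cores
                if threads <= max_threads:
                    topologies.append((sockets, cores, threads))
        return topologies

    # For the m1.nano flavor being built here (vcpus=1) this yields exactly
    # one topology, matching "Got 1 possible topologies" above.
    print(possible_topologies(1))  # [(1, 1, 1)]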
[ 573.297221] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Creating folder: Instances. Parent ref: group-v692312. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 573.297221] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cbd21012-93cc-43be-9c77-e4e1e69cfa01 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.328099] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11bed886-abbb-4a6a-8ed9-10bd984cfae5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.330835] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Created folder: Instances in parent group-v692312. [ 573.332754] env[69648]: DEBUG oslo.service.loopingcall [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 573.332754] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 573.332754] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bac16b6f-c4a6-4b83-8c58-29ae4e3b4afd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.347921] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d43b73a-f7ed-407e-b7ff-298a5f65832c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.352938] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 573.352938] env[69648]: value = "task-3466458" [ 573.352938] env[69648]: _type = "Task" [ 573.352938] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 573.365805] env[69648]: DEBUG nova.compute.provider_tree [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 573.372904] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466458, 'name': CreateVM_Task} progress is 6%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 573.378091] env[69648]: DEBUG nova.scheduler.client.report [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 573.405479] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.366s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 573.406435] env[69648]: DEBUG nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 573.469185] env[69648]: DEBUG nova.compute.utils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 573.470495] env[69648]: DEBUG nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 573.470789] env[69648]: DEBUG nova.network.neutron [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 573.486594] env[69648]: DEBUG nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Start building block device mappings for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 573.541844] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 573.542174] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 573.542375] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 573.542517] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 573.542931] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 573.546870] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05a8434e-46e4-41fe-94df-2e9bb54dd59a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.568627] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 573.568948] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 573.569693] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-248fcd86-3724-4406-b615-997ccbc5c646 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.580980] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-192addfb-9007-46fe-958a-3d21e8a60a81 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.588576] env[69648]: DEBUG oslo_vmware.api [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Waiting for the task: (returnval){ [ 573.588576] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528467f5-d8fe-157e-1601-5bce3af5b37f" [ 573.588576] env[69648]: _type = "Task" [ 573.588576] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 573.592765] env[69648]: DEBUG nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 573.606797] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 573.607121] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Creating directory with path [datastore1] vmware_temp/0863602a-4ecc-4327-83a8-f22975dbf813/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 573.607674] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cd6f7392-f59a-4f0d-a5a9-ac360b805500 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.632672] env[69648]: DEBUG nova.virt.hardware [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: 
False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 573.632942] env[69648]: DEBUG nova.virt.hardware [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 573.634894] env[69648]: DEBUG nova.virt.hardware [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 573.635258] env[69648]: DEBUG nova.virt.hardware [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 573.635258] env[69648]: DEBUG nova.virt.hardware [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 573.635618] env[69648]: DEBUG nova.virt.hardware [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 573.635618] env[69648]: DEBUG nova.virt.hardware [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 573.635956] env[69648]: DEBUG nova.virt.hardware [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 573.635956] env[69648]: DEBUG nova.virt.hardware [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 573.636068] env[69648]: DEBUG nova.virt.hardware [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 573.636219] env[69648]: DEBUG nova.virt.hardware [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 573.637108] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4e88c7c3-3527-40f3-92ba-37e843e143c2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.644369] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Created directory with path [datastore1] vmware_temp/0863602a-4ecc-4327-83a8-f22975dbf813/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 573.644434] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Fetch image to [datastore1] vmware_temp/0863602a-4ecc-4327-83a8-f22975dbf813/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 573.644640] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/0863602a-4ecc-4327-83a8-f22975dbf813/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 573.645726] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3da3cd8b-c533-4f35-981d-ad4cdbfc3230 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.655103] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-520e80e7-14a1-4573-9cf9-def12b429b66 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.668921] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9a46749-29d0-485b-a18b-e05851c7bab7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.689733] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-940164ff-2efd-48d3-9b59-172193e512ed {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.725022] env[69648]: DEBUG nova.policy [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8e4f9c39421946d9a27b8295a8df6656', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b65c626fb0c64102a2e57310bd84adb6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 573.725500] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3b0ab4eb-538b-4e63-8fd6-9fe376a6ae5e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.731682] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-706280b5-0742-4724-926c-032a110a3caa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.767261] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 573.866308] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466458, 'name': CreateVM_Task, 'duration_secs': 0.2913} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 573.866308] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 573.866308] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 573.866308] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 573.866308] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 573.868500] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2fbd626e-4ea2-4dfe-bd1e-3e4e7f764532 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.873261] env[69648]: DEBUG oslo_vmware.api [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Waiting for the task: (returnval){ [ 573.873261] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528a9505-f2b9-957c-c84e-1711a94f40b5" [ 573.873261] env[69648]: _type = "Task" [ 573.873261] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 573.879561] env[69648]: DEBUG oslo_vmware.rw_handles [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0863602a-4ecc-4327-83a8-f22975dbf813/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 573.950581] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_power_states {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 573.961439] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 573.961685] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 573.961908] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 573.962786] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquiring lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 573.962786] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 573.963938] env[69648]: DEBUG oslo_vmware.rw_handles [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Completed reading data from the image iterator. 
{{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 573.964509] env[69648]: DEBUG oslo_vmware.rw_handles [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0863602a-4ecc-4327-83a8-f22975dbf813/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 573.979174] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Getting list of instances from cluster (obj){ [ 573.979174] env[69648]: value = "domain-c8" [ 573.979174] env[69648]: _type = "ClusterComputeResource" [ 573.979174] env[69648]: } {{(pid=69648) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 573.983735] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b263dc84-120d-448c-a73d-467280cb8812 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.990946] env[69648]: DEBUG nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 573.999321] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Got total of 2 instances {{(pid=69648) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 573.999321] env[69648]: WARNING nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] While synchronizing instance power states, found 4 instances in the database and 2 instances on the hypervisor. 
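The warning just above comes from the ComputeManager._sync_power_states periodic task: it counted four instances for this host in the database but only two VMs on the vSphere cluster, and it then syncs each database instance by UUID (the "Triggering sync for uuid ..." lines that follow). A simplified, self-contained sketch of that comparison is below; the function name and structure are illustrative assumptions rather than Nova's actual code, and the UUIDs are the ones appearing in this log.

    import logging

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger("power_state_sync_sketch")

    def sync_power_states(db_uuids, hypervisor_uuids, trigger_sync):
        """Compare the DB view with the hypervisor view, then sync per UUID."""
        if len(db_uuids) != len(hypervisor_uuids):
            LOG.warning("While synchronizing instance power states, found %d "
                        "instances in the database and %d instances on the "
                        "hypervisor.", len(db_uuids), len(hypervisor_uuids))
        for uuid in db_uuids:
            LOG.debug("Triggering sync for uuid %s", uuid)
            trigger_sync(uuid)

    # Toy data mirroring this run: four UUIDs in the database, two VMs present.
    db = ["de093ae4-0e4c-49e8-9beb-c61501c5c705",
          "549c349e-5417-408c-acb2-93e506476e2a",
          "d2e78734-c619-43ab-bdad-bc18cc78c5e5",
          "54630c78-200e-4b36-8612-34f411e08821"]
    sync_power_states(db, db[:2], trigger_sync=lambda uuid: None)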
[ 573.999321] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid de093ae4-0e4c-49e8-9beb-c61501c5c705 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 573.999321] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 549c349e-5417-408c-acb2-93e506476e2a {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 573.999321] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid d2e78734-c619-43ab-bdad-bc18cc78c5e5 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 573.999321] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 54630c78-200e-4b36-8612-34f411e08821 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 573.999582] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "de093ae4-0e4c-49e8-9beb-c61501c5c705" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 573.999582] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "549c349e-5417-408c-acb2-93e506476e2a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 573.999582] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "d2e78734-c619-43ab-bdad-bc18cc78c5e5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 573.999582] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "54630c78-200e-4b36-8612-34f411e08821" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 573.999871] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 573.999871] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Getting list of instances from cluster (obj){ [ 573.999871] env[69648]: value = "domain-c8" [ 573.999871] env[69648]: _type = "ClusterComputeResource" [ 573.999871] env[69648]: } {{(pid=69648) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 574.001231] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd2f92b0-9cf2-4df2-a8be-fa1e6e8d66bc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.014714] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Got total of 2 instances {{(pid=69648) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 574.082495] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 574.084170] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 574.084170] env[69648]: INFO nova.compute.claims [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 574.266208] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bd8596e-82d4-4b45-be05-edde2fbba3ca {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.275168] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b80bae1e-7ecb-4e47-a72d-dc919321a022 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.317752] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f2d0d50-8099-4b4c-b7c9-f0beabf8f1d6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.327130] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bc1abdf-7b7c-4413-83f3-72adb713958d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.345543] env[69648]: DEBUG nova.compute.provider_tree [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 574.362093] env[69648]: DEBUG nova.scheduler.client.report [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 574.387178] env[69648]: DEBUG 
oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 574.387178] env[69648]: DEBUG nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 574.448035] env[69648]: DEBUG nova.compute.utils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 574.449543] env[69648]: DEBUG nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 574.449543] env[69648]: DEBUG nova.network.neutron [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 574.471401] env[69648]: DEBUG nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 574.597429] env[69648]: DEBUG nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 574.635962] env[69648]: DEBUG nova.virt.hardware [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 574.635962] env[69648]: DEBUG nova.virt.hardware [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 574.635962] env[69648]: DEBUG nova.virt.hardware [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 574.636177] env[69648]: DEBUG nova.virt.hardware [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 574.636177] env[69648]: DEBUG nova.virt.hardware [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 574.636269] env[69648]: DEBUG nova.virt.hardware [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 574.636525] env[69648]: DEBUG nova.virt.hardware [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 574.636707] env[69648]: DEBUG nova.virt.hardware [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 574.636872] env[69648]: DEBUG nova.virt.hardware [None req-638cb966-e3a5-4146-9e42-495dd8569808 
tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 574.637064] env[69648]: DEBUG nova.virt.hardware [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 574.637315] env[69648]: DEBUG nova.virt.hardware [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 574.638210] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d23c5b70-fa54-4fdc-ab7e-400484ba74bb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.652600] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81ce1cc6-db9f-4910-bc04-8238e96573da {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.793369] env[69648]: DEBUG nova.policy [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '44f0de2418a945c48a554a76a781d9a6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7e684ee7067b4ec9a045a2a9aae1a125', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 575.801399] env[69648]: DEBUG nova.network.neutron [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Successfully created port: 9cdb2d48-8d19-4027-ae0e-843a48fab651 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 576.011150] env[69648]: DEBUG nova.network.neutron [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Successfully created port: f70a73df-7bab-4662-8c43-ea552be93595 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 576.935642] env[69648]: DEBUG nova.network.neutron [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Successfully created port: 5a80355c-9e5f-4711-8703-ae50b6091cef {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 577.499361] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 
tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquiring lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 577.499592] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 577.518694] env[69648]: DEBUG nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 577.582962] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 577.583221] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 577.586492] env[69648]: INFO nova.compute.claims [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 577.805990] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f07f8eea-298a-415e-acd7-0c2590079006 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.815563] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9923b9bb-e40a-487e-8c8c-23f6f35fc05d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.852129] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66f8c515-86ee-44a7-88de-4a2aa4f03877 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.860648] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0f4f43b-c469-4e03-99a3-7b574de87c78 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.879759] env[69648]: DEBUG nova.compute.provider_tree [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 577.890708] env[69648]: DEBUG nova.scheduler.client.report [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 577.907714] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.324s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 577.908219] env[69648]: DEBUG nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 577.963198] env[69648]: DEBUG nova.compute.utils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 577.963958] env[69648]: DEBUG nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Allocating IP information in the background. 
{{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 577.964298] env[69648]: DEBUG nova.network.neutron [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 577.987336] env[69648]: DEBUG nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 578.078711] env[69648]: DEBUG nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 578.109759] env[69648]: DEBUG nova.virt.hardware [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 578.109907] env[69648]: DEBUG nova.virt.hardware [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 578.110018] env[69648]: DEBUG nova.virt.hardware [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 578.110214] env[69648]: DEBUG nova.virt.hardware [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 578.110362] env[69648]: DEBUG nova.virt.hardware [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 
tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 578.110511] env[69648]: DEBUG nova.virt.hardware [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 578.110714] env[69648]: DEBUG nova.virt.hardware [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 578.110877] env[69648]: DEBUG nova.virt.hardware [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 578.111067] env[69648]: DEBUG nova.virt.hardware [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 578.111241] env[69648]: DEBUG nova.virt.hardware [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 578.111479] env[69648]: DEBUG nova.virt.hardware [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 578.112314] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4503c0ab-273b-4fb3-baa8-e1e66d49198a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.128648] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a50dbe35-affc-4894-823b-6ffd8733ece2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.501167] env[69648]: DEBUG nova.policy [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0651ca36051448aa9df7697cf20c1ffe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'c5941981f56d4cefa3683a8a350d3032', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 579.739251] env[69648]: DEBUG nova.network.neutron [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Successfully updated port: 9cdb2d48-8d19-4027-ae0e-843a48fab651 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 579.759403] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Acquiring lock "refresh_cache-549c349e-5417-408c-acb2-93e506476e2a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 579.759529] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Acquired lock "refresh_cache-549c349e-5417-408c-acb2-93e506476e2a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 579.759703] env[69648]: DEBUG nova.network.neutron [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 579.981117] env[69648]: DEBUG nova.network.neutron [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 580.322419] env[69648]: DEBUG nova.network.neutron [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Successfully updated port: f70a73df-7bab-4662-8c43-ea552be93595 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 580.335621] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Acquiring lock "refresh_cache-54630c78-200e-4b36-8612-34f411e08821" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 580.335621] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Acquired lock "refresh_cache-54630c78-200e-4b36-8612-34f411e08821" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 580.336088] env[69648]: DEBUG nova.network.neutron [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 580.425057] env[69648]: DEBUG nova.network.neutron [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 581.063332] env[69648]: DEBUG nova.network.neutron [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Successfully created port: b1b293d0-45ba-44c3-83cb-d776c37d09de {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 581.101096] env[69648]: DEBUG nova.network.neutron [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Successfully updated port: 5a80355c-9e5f-4711-8703-ae50b6091cef {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 581.118296] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquiring lock "refresh_cache-9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 581.118296] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquired lock "refresh_cache-9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 581.118296] env[69648]: DEBUG nova.network.neutron [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 581.265846] env[69648]: DEBUG nova.network.neutron [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Updating instance_info_cache with network_info: [{"id": "9cdb2d48-8d19-4027-ae0e-843a48fab651", "address": "fa:16:3e:e8:50:81", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.117", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9cdb2d48-8d", "ovs_interfaceid": "9cdb2d48-8d19-4027-ae0e-843a48fab651", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 581.285761] env[69648]: DEBUG 
nova.network.neutron [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 581.288271] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Releasing lock "refresh_cache-549c349e-5417-408c-acb2-93e506476e2a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 581.291239] env[69648]: DEBUG nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Instance network_info: |[{"id": "9cdb2d48-8d19-4027-ae0e-843a48fab651", "address": "fa:16:3e:e8:50:81", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.117", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9cdb2d48-8d", "ovs_interfaceid": "9cdb2d48-8d19-4027-ae0e-843a48fab651", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 581.291385] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e8:50:81', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92233552-2c0c-416e-9bf3-bfcca8eda2dc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9cdb2d48-8d19-4027-ae0e-843a48fab651', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 581.298020] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Creating folder: Project (6e91c0a40b694d37b39c865a0d835032). Parent ref: group-v692308. 
{{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.298599] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8a9e6fb8-e4bf-48d7-9b6e-c48d7dc2ccb8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.310959] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Created folder: Project (6e91c0a40b694d37b39c865a0d835032) in parent group-v692308. [ 581.312019] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Creating folder: Instances. Parent ref: group-v692315. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.312019] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fb2be41d-d8b7-43dc-b76c-aaa78886db1a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.321270] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Created folder: Instances in parent group-v692315. [ 581.321553] env[69648]: DEBUG oslo.service.loopingcall [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 581.321793] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 581.322079] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-282bddc6-2ec8-4893-856f-d93436feb3a8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.343753] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 581.343753] env[69648]: value = "task-3466461" [ 581.343753] env[69648]: _type = "Task" [ 581.343753] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 581.351802] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466461, 'name': CreateVM_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 581.658133] env[69648]: DEBUG nova.network.neutron [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Updating instance_info_cache with network_info: [{"id": "f70a73df-7bab-4662-8c43-ea552be93595", "address": "fa:16:3e:06:cb:82", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf70a73df-7b", "ovs_interfaceid": "f70a73df-7bab-4662-8c43-ea552be93595", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 581.675718] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Releasing lock "refresh_cache-54630c78-200e-4b36-8612-34f411e08821" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 581.677912] env[69648]: DEBUG nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Instance network_info: |[{"id": "f70a73df-7bab-4662-8c43-ea552be93595", "address": "fa:16:3e:06:cb:82", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf70a73df-7b", "ovs_interfaceid": "f70a73df-7bab-4662-8c43-ea552be93595", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 581.678113] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None 
req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:06:cb:82', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92233552-2c0c-416e-9bf3-bfcca8eda2dc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f70a73df-7bab-4662-8c43-ea552be93595', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 581.684904] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Creating folder: Project (b65c626fb0c64102a2e57310bd84adb6). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.685886] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9592e0ca-2691-4c42-bab9-25798754b00c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.698999] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Created folder: Project (b65c626fb0c64102a2e57310bd84adb6) in parent group-v692308. [ 581.699256] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Creating folder: Instances. Parent ref: group-v692318. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.699517] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-52add982-a92b-492c-962f-ac8799422e43 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.709218] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Created folder: Instances in parent group-v692318. [ 581.709218] env[69648]: DEBUG oslo.service.loopingcall [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 581.709560] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 54630c78-200e-4b36-8612-34f411e08821] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 581.709635] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-614f6c24-372c-430e-8622-7ed185534274 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.737552] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 581.737552] env[69648]: value = "task-3466464" [ 581.737552] env[69648]: _type = "Task" [ 581.737552] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 581.745863] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466464, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 581.760254] env[69648]: DEBUG nova.compute.manager [req-66a48157-37ff-4d2e-9ecd-e098e257ecc4 req-08b2166b-f53f-4cc6-a5dd-3b2c1f2dad33 service nova] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Received event network-vif-plugged-9cdb2d48-8d19-4027-ae0e-843a48fab651 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 581.760254] env[69648]: DEBUG oslo_concurrency.lockutils [req-66a48157-37ff-4d2e-9ecd-e098e257ecc4 req-08b2166b-f53f-4cc6-a5dd-3b2c1f2dad33 service nova] Acquiring lock "549c349e-5417-408c-acb2-93e506476e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 581.760254] env[69648]: DEBUG oslo_concurrency.lockutils [req-66a48157-37ff-4d2e-9ecd-e098e257ecc4 req-08b2166b-f53f-4cc6-a5dd-3b2c1f2dad33 service nova] Lock "549c349e-5417-408c-acb2-93e506476e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 581.760254] env[69648]: DEBUG oslo_concurrency.lockutils [req-66a48157-37ff-4d2e-9ecd-e098e257ecc4 req-08b2166b-f53f-4cc6-a5dd-3b2c1f2dad33 service nova] Lock "549c349e-5417-408c-acb2-93e506476e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 581.760418] env[69648]: DEBUG nova.compute.manager [req-66a48157-37ff-4d2e-9ecd-e098e257ecc4 req-08b2166b-f53f-4cc6-a5dd-3b2c1f2dad33 service nova] [instance: 549c349e-5417-408c-acb2-93e506476e2a] No waiting events found dispatching network-vif-plugged-9cdb2d48-8d19-4027-ae0e-843a48fab651 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 581.760418] env[69648]: WARNING nova.compute.manager [req-66a48157-37ff-4d2e-9ecd-e098e257ecc4 req-08b2166b-f53f-4cc6-a5dd-3b2c1f2dad33 service nova] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Received unexpected event network-vif-plugged-9cdb2d48-8d19-4027-ae0e-843a48fab651 for instance with vm_state building and task_state spawning. [ 581.855269] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466461, 'name': CreateVM_Task, 'duration_secs': 0.293579} completed successfully.
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 581.855796] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 581.880816] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 581.880816] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 581.881125] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 581.881448] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-438310c9-2e2d-43cc-ba4d-bbdf385c4ea1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.888281] env[69648]: DEBUG oslo_vmware.api [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Waiting for the task: (returnval){ [ 581.888281] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528fe823-9532-5dc0-2e43-7133c6f0df49" [ 581.888281] env[69648]: _type = "Task" [ 581.888281] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 581.898992] env[69648]: DEBUG oslo_vmware.api [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528fe823-9532-5dc0-2e43-7133c6f0df49, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 582.046418] env[69648]: DEBUG nova.compute.manager [req-6793bbfd-35bc-4243-bb21-a35d5824fdc8 req-366fa73e-69d7-4b23-958b-157877ccef57 service nova] [instance: 54630c78-200e-4b36-8612-34f411e08821] Received event network-vif-plugged-f70a73df-7bab-4662-8c43-ea552be93595 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 582.049016] env[69648]: DEBUG oslo_concurrency.lockutils [req-6793bbfd-35bc-4243-bb21-a35d5824fdc8 req-366fa73e-69d7-4b23-958b-157877ccef57 service nova] Acquiring lock "54630c78-200e-4b36-8612-34f411e08821-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 582.049016] env[69648]: DEBUG oslo_concurrency.lockutils [req-6793bbfd-35bc-4243-bb21-a35d5824fdc8 req-366fa73e-69d7-4b23-958b-157877ccef57 service nova] Lock "54630c78-200e-4b36-8612-34f411e08821-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 582.049016] env[69648]: DEBUG oslo_concurrency.lockutils [req-6793bbfd-35bc-4243-bb21-a35d5824fdc8 req-366fa73e-69d7-4b23-958b-157877ccef57 service nova] Lock "54630c78-200e-4b36-8612-34f411e08821-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 582.049016] env[69648]: DEBUG nova.compute.manager [req-6793bbfd-35bc-4243-bb21-a35d5824fdc8 req-366fa73e-69d7-4b23-958b-157877ccef57 service nova] [instance: 54630c78-200e-4b36-8612-34f411e08821] No waiting events found dispatching network-vif-plugged-f70a73df-7bab-4662-8c43-ea552be93595 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 582.049496] env[69648]: WARNING nova.compute.manager [req-6793bbfd-35bc-4243-bb21-a35d5824fdc8 req-366fa73e-69d7-4b23-958b-157877ccef57 service nova] [instance: 54630c78-200e-4b36-8612-34f411e08821] Received unexpected event network-vif-plugged-f70a73df-7bab-4662-8c43-ea552be93595 for instance with vm_state building and task_state spawning. [ 582.251089] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466464, 'name': CreateVM_Task, 'duration_secs': 0.298881} completed successfully.
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 582.251361] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 54630c78-200e-4b36-8612-34f411e08821] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 582.252040] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 582.360284] env[69648]: DEBUG nova.network.neutron [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Updating instance_info_cache with network_info: [{"id": "5a80355c-9e5f-4711-8703-ae50b6091cef", "address": "fa:16:3e:29:1b:e3", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5a80355c-9e", "ovs_interfaceid": "5a80355c-9e5f-4711-8703-ae50b6091cef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 582.374913] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Releasing lock "refresh_cache-9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 582.375657] env[69648]: DEBUG nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Instance network_info: |[{"id": "5a80355c-9e5f-4711-8703-ae50b6091cef", "address": "fa:16:3e:29:1b:e3", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", 
"port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5a80355c-9e", "ovs_interfaceid": "5a80355c-9e5f-4711-8703-ae50b6091cef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 582.376754] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:29:1b:e3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92233552-2c0c-416e-9bf3-bfcca8eda2dc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5a80355c-9e5f-4711-8703-ae50b6091cef', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 582.385868] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Creating folder: Project (7e684ee7067b4ec9a045a2a9aae1a125). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 582.386675] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3bb88b54-162f-4dab-bc39-77eb7fc8c637 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.402105] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Created folder: Project (7e684ee7067b4ec9a045a2a9aae1a125) in parent group-v692308. [ 582.402105] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Creating folder: Instances. Parent ref: group-v692321. 
{{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 582.403425] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2ab14a1c-92be-46e0-a18a-876bef535ff5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.409737] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 582.409737] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 582.410249] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 582.410332] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 582.410611] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 582.410852] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9ec4f696-8601-4046-9340-da81ead7668b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.416798] env[69648]: DEBUG oslo_vmware.api [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Waiting for the task: (returnval){ [ 582.416798] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52a22bb4-d061-ee04-a604-94760e03a857" [ 582.416798] env[69648]: _type = "Task" [ 582.416798] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 582.421020] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Created folder: Instances in parent group-v692321. 
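The devstack-image-cache entries above ("Acquiring lock [datastore1] devstack-image-cache_base/…", "Acquired external semaphore …", and later "Releasing lock …") follow the standard oslo.concurrency pattern for serializing access to the cached image while each CreateVM_Task flow checks it. The sketch below is illustrative only: the function names are hypothetical and are not Nova's implementation; only the lock names and the lockutils calls themselves are taken from this log.

# Minimal sketch, assuming nothing beyond oslo.concurrency itself; the functions
# here are hypothetical and do not correspond to Nova's code.
from oslo_concurrency import lockutils

IMAGE_CACHE_LOCK = ("[datastore1] devstack-image-cache_base/"
                    "b010aefa-553b-437c-bd1e-78b0a276a491")

def fetch_image_if_missing(copy_image_fn):
    # Context-manager form: emits the "Acquiring lock" / "Acquired lock" /
    # "Releasing lock" DEBUG lines (logged from lockutils' lock()) seen above.
    with lockutils.lock(IMAGE_CACHE_LOCK):
        copy_image_fn()

@lockutils.synchronized("compute_resources")
def instance_claim():
    # Decorator form: emits the 'Lock "compute_resources" acquired by "…" ::
    # waited 0.000s' and ':: held …s' lines seen for the resource tracker.
    pass

Because every build request uses the same image-cache lock name, the concurrent spawns in this trace take turns on the cached-image check instead of racing on the datastore.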
[ 582.422771] env[69648]: DEBUG oslo.service.loopingcall [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 582.422771] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 582.422771] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6723fa0e-a7da-42c7-b2a9-aaf3cf372282 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.442826] env[69648]: DEBUG oslo_vmware.api [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52a22bb4-d061-ee04-a604-94760e03a857, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 582.448632] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 582.448632] env[69648]: value = "task-3466467" [ 582.448632] env[69648]: _type = "Task" [ 582.448632] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 582.458973] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466467, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 582.934125] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 582.934125] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 582.934125] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 582.962087] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466467, 'name': CreateVM_Task, 'duration_secs': 0.313605} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 582.962087] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 582.963948] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 582.963948] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 582.963948] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 582.963948] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a39e3ad8-8542-4aa2-8eef-d0d91bd00c28 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.968534] env[69648]: DEBUG oslo_vmware.api [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Waiting for the task: (returnval){ [ 582.968534] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]520a9d89-db5f-a353-b695-ba0a136b87d1" [ 582.968534] env[69648]: _type = "Task" [ 582.968534] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 582.977978] env[69648]: DEBUG oslo_vmware.api [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]520a9d89-db5f-a353-b695-ba0a136b87d1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 583.104060] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 583.104060] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 583.104060] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 583.104060] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 583.124553] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 583.124720] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 583.124857] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 583.124989] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 54630c78-200e-4b36-8612-34f411e08821] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 583.125137] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 583.125264] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 583.125447] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 583.125941] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 583.126210] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 583.126451] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 583.126604] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 583.126810] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 583.127208] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 583.127292] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 583.127453] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 583.147937] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.148261] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 583.148623] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 583.148623] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 583.152470] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73e8ef6b-3c14-4424-a4b3-7cb0a24dc069 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.162261] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f4aaa1c-3daa-4fcb-b53b-29645033e575 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.183266] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f47d7e0-8e93-4d85-abec-1f3ae7ef50d0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.189252] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e9e3c3b-9857-4d29-b1b9-8bcbb6bc39e9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.229469] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180995MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 583.229679] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.229913] 
env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 583.312317] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance de093ae4-0e4c-49e8-9beb-c61501c5c705 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 583.313017] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 549c349e-5417-408c-acb2-93e506476e2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 583.313017] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d2e78734-c619-43ab-bdad-bc18cc78c5e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 583.313017] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 54630c78-200e-4b36-8612-34f411e08821 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 583.313017] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 583.313323] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ce04d2df-8587-4cda-93b1-cad7ba3ff670 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
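The Acquiring/acquired/released triplets around the resource tracker are oslo.concurrency's lockutils at work; roughly the pattern being traced (lock names taken from the log, function bodies placeholders):

    from oslo_concurrency import lockutils

    # Decorator form: every caller sharing the name "compute_resources" is serialized,
    # which is where the waited/held timings in the DEBUG lines above come from.
    @lockutils.synchronized('compute_resources')
    def update_available_resource():
        pass  # audit and publish the node's resource view

    # Equivalent context-manager form for ad-hoc critical sections,
    # e.g. the per-instance "refresh_cache-<instance-uuid>" locks above.
    with lockutils.lock('refresh_cache-<instance-uuid>'):
        pass  # rebuild the instance's network info cache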
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 583.313323] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 583.313388] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 583.426494] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d7a4552-395a-49cf-bf23-861882e6c2a8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.434693] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6102858-c569-4266-905a-a6ee6c699279 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.466043] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af85ca8d-8914-46d3-b410-6710a1144e63 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.479614] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a0bcc54-7e03-45f0-8a5b-0fdb0036d060 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.483910] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 583.484055] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 583.484298] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 583.494454] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 583.505826] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider 
d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 583.524422] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 583.524527] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.295s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 584.829531] env[69648]: DEBUG nova.network.neutron [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Successfully updated port: b1b293d0-45ba-44c3-83cb-d776c37d09de {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 584.849483] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquiring lock "refresh_cache-ce04d2df-8587-4cda-93b1-cad7ba3ff670" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 584.849638] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquired lock "refresh_cache-ce04d2df-8587-4cda-93b1-cad7ba3ff670" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 584.849789] env[69648]: DEBUG nova.network.neutron [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 585.028070] env[69648]: DEBUG nova.network.neutron [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Instance cache missing network info. 
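From the inventory reported above, the capacity placement has to work with is, per resource class, (total - reserved) * allocation_ratio, with max_unit capping any single allocation; checking the logged numbers:

    # Figures copied from the inventory data logged above for provider d38a352b-7808-44da-8216-792e96aadc88.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 94},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: capacity={capacity:g}, max per allocation={inv['max_unit']}")
    # VCPU: capacity=192, max per allocation=16
    # MEMORY_MB: capacity=196078, max per allocation=65530
    # DISK_GB: capacity=400, max per allocation=94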
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 585.477152] env[69648]: DEBUG nova.compute.manager [req-49f416fd-8acd-4312-802b-cb8362f8fad6 req-808dc00a-23ee-44b8-9194-6d2b209c769f service nova] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Received event network-changed-9cdb2d48-8d19-4027-ae0e-843a48fab651 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 585.477282] env[69648]: DEBUG nova.compute.manager [req-49f416fd-8acd-4312-802b-cb8362f8fad6 req-808dc00a-23ee-44b8-9194-6d2b209c769f service nova] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Refreshing instance network info cache due to event network-changed-9cdb2d48-8d19-4027-ae0e-843a48fab651. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 585.477531] env[69648]: DEBUG oslo_concurrency.lockutils [req-49f416fd-8acd-4312-802b-cb8362f8fad6 req-808dc00a-23ee-44b8-9194-6d2b209c769f service nova] Acquiring lock "refresh_cache-549c349e-5417-408c-acb2-93e506476e2a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 585.477702] env[69648]: DEBUG oslo_concurrency.lockutils [req-49f416fd-8acd-4312-802b-cb8362f8fad6 req-808dc00a-23ee-44b8-9194-6d2b209c769f service nova] Acquired lock "refresh_cache-549c349e-5417-408c-acb2-93e506476e2a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 585.477865] env[69648]: DEBUG nova.network.neutron [req-49f416fd-8acd-4312-802b-cb8362f8fad6 req-808dc00a-23ee-44b8-9194-6d2b209c769f service nova] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Refreshing network info cache for port 9cdb2d48-8d19-4027-ae0e-843a48fab651 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 585.507175] env[69648]: DEBUG nova.compute.manager [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Received event network-vif-plugged-5a80355c-9e5f-4711-8703-ae50b6091cef {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 585.507723] env[69648]: DEBUG oslo_concurrency.lockutils [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] Acquiring lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.508070] env[69648]: DEBUG oslo_concurrency.lockutils [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] Lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 585.508142] env[69648]: DEBUG oslo_concurrency.lockutils [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] Lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 585.508475] env[69648]: DEBUG nova.compute.manager [req-3a7151d5-5d36-445f-ae96-948a1da1960c 
req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] No waiting events found dispatching network-vif-plugged-5a80355c-9e5f-4711-8703-ae50b6091cef {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 585.508475] env[69648]: WARNING nova.compute.manager [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Received unexpected event network-vif-plugged-5a80355c-9e5f-4711-8703-ae50b6091cef for instance with vm_state building and task_state spawning. [ 585.508691] env[69648]: DEBUG nova.compute.manager [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 54630c78-200e-4b36-8612-34f411e08821] Received event network-changed-f70a73df-7bab-4662-8c43-ea552be93595 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 585.509654] env[69648]: DEBUG nova.compute.manager [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 54630c78-200e-4b36-8612-34f411e08821] Refreshing instance network info cache due to event network-changed-f70a73df-7bab-4662-8c43-ea552be93595. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 585.509654] env[69648]: DEBUG oslo_concurrency.lockutils [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] Acquiring lock "refresh_cache-54630c78-200e-4b36-8612-34f411e08821" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 585.509654] env[69648]: DEBUG oslo_concurrency.lockutils [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] Acquired lock "refresh_cache-54630c78-200e-4b36-8612-34f411e08821" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 585.509753] env[69648]: DEBUG nova.network.neutron [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 54630c78-200e-4b36-8612-34f411e08821] Refreshing network info cache for port f70a73df-7bab-4662-8c43-ea552be93595 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 585.845477] env[69648]: DEBUG nova.network.neutron [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Updating instance_info_cache with network_info: [{"id": "b1b293d0-45ba-44c3-83cb-d776c37d09de", "address": "fa:16:3e:82:26:64", "network": {"id": "59475999-907e-40ca-afd5-2a03b1301111", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1952049877-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c5941981f56d4cefa3683a8a350d3032", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d0c6fd7-3cc9-4818-9475-8f15900394cc", "external-id": 
"nsx-vlan-transportzone-317", "segmentation_id": 317, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb1b293d0-45", "ovs_interfaceid": "b1b293d0-45ba-44c3-83cb-d776c37d09de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 585.874402] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Releasing lock "refresh_cache-ce04d2df-8587-4cda-93b1-cad7ba3ff670" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 585.874787] env[69648]: DEBUG nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Instance network_info: |[{"id": "b1b293d0-45ba-44c3-83cb-d776c37d09de", "address": "fa:16:3e:82:26:64", "network": {"id": "59475999-907e-40ca-afd5-2a03b1301111", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1952049877-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c5941981f56d4cefa3683a8a350d3032", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d0c6fd7-3cc9-4818-9475-8f15900394cc", "external-id": "nsx-vlan-transportzone-317", "segmentation_id": 317, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb1b293d0-45", "ovs_interfaceid": "b1b293d0-45ba-44c3-83cb-d776c37d09de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 585.875460] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:82:26:64', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d0c6fd7-3cc9-4818-9475-8f15900394cc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b1b293d0-45ba-44c3-83cb-d776c37d09de', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 585.901271] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Creating folder: Project (c5941981f56d4cefa3683a8a350d3032). Parent ref: group-v692308. 
{{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 585.902086] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bc285d9a-81dc-4841-95fa-84260cb6d9dd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.919119] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Created folder: Project (c5941981f56d4cefa3683a8a350d3032) in parent group-v692308. [ 585.919329] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Creating folder: Instances. Parent ref: group-v692324. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 585.919578] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6e67e827-3a4e-4499-99a4-5e636cc87423 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.930710] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Created folder: Instances in parent group-v692324. [ 585.931071] env[69648]: DEBUG oslo.service.loopingcall [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 585.931799] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 585.933039] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f7a407ce-5ac3-46cf-bd70-7860fe213ba8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.954998] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 585.954998] env[69648]: value = "task-3466470" [ 585.954998] env[69648]: _type = "Task" [ 585.954998] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 585.963725] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466470, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 586.466669] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466470, 'name': CreateVM_Task, 'duration_secs': 0.326317} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 586.468124] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 586.469550] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 586.469550] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 586.469550] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 586.472406] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c5f916c0-bc9a-47b5-909b-bb94c39d7f0f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.477212] env[69648]: DEBUG oslo_vmware.api [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Waiting for the task: (returnval){ [ 586.477212] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c5db9b-142b-690b-baf1-51c6e7c73b3f" [ 586.477212] env[69648]: _type = "Task" [ 586.477212] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 586.492622] env[69648]: DEBUG oslo_vmware.api [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c5db9b-142b-690b-baf1-51c6e7c73b3f, 'name': SearchDatastore_Task} progress is 0%. 
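The CreateVM_Task and SearchDatastore_Task polling above follows oslo.vmware's invoke-then-wait pattern; a minimal sketch, with the vCenter host, credentials and managed-object references (folder_ref, config_spec, res_pool_ref) all placeholders:

    from oslo_vmware import api

    session = api.VMwareAPISession('vcenter.example.test', 'svc-user', 'secret',
                                   10, 0.5)  # retry count and task poll interval
    # invoke_api() issues the SOAP call ("Invoking Folder.CreateVM_Task ..."),
    # wait_for_task() polls the returned task until it finishes
    # ("progress is 0%" ... "completed successfully").
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=res_pool_ref)
    task_info = session.wait_for_task(task)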
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 586.747864] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquiring lock "dfbb396b-8f18-456d-9064-be451cdd1ac9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 586.748804] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Lock "dfbb396b-8f18-456d-9064-be451cdd1ac9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 586.767720] env[69648]: DEBUG nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 586.849423] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 586.849696] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 586.851355] env[69648]: INFO nova.compute.claims [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 586.996408] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 586.996704] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 586.997431] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 
tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 587.094231] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddeed71a-6ac7-4c29-a10f-6d4eb3265a69 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 587.103652] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f737b4ac-00b6-4229-92c6-cc57d1a03d68 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 587.139278] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b566bf1-f36f-4961-accd-26f5a4b23615 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 587.147564] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b1a4307-d59e-47aa-a736-055c97a2441c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 587.167794] env[69648]: DEBUG nova.compute.provider_tree [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 587.181758] env[69648]: DEBUG nova.scheduler.client.report [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 587.206656] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.354s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 587.206656] env[69648]: DEBUG nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Start building networks asynchronously for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 587.263800] env[69648]: DEBUG nova.compute.utils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 587.266557] env[69648]: DEBUG nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 587.267080] env[69648]: DEBUG nova.network.neutron [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 587.281129] env[69648]: DEBUG nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 587.331173] env[69648]: DEBUG nova.network.neutron [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 54630c78-200e-4b36-8612-34f411e08821] Updated VIF entry in instance network info cache for port f70a73df-7bab-4662-8c43-ea552be93595. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 587.333035] env[69648]: DEBUG nova.network.neutron [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 54630c78-200e-4b36-8612-34f411e08821] Updating instance_info_cache with network_info: [{"id": "f70a73df-7bab-4662-8c43-ea552be93595", "address": "fa:16:3e:06:cb:82", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf70a73df-7b", "ovs_interfaceid": "f70a73df-7bab-4662-8c43-ea552be93595", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 587.351771] env[69648]: DEBUG oslo_concurrency.lockutils [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] Releasing lock "refresh_cache-54630c78-200e-4b36-8612-34f411e08821" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 587.352033] env[69648]: DEBUG nova.compute.manager [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Received event network-changed-5a80355c-9e5f-4711-8703-ae50b6091cef {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 587.352205] env[69648]: DEBUG nova.compute.manager [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Refreshing instance network info cache due to event network-changed-5a80355c-9e5f-4711-8703-ae50b6091cef. 
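The network-changed / network-vif-plugged entries are Neutron-to-Nova external events; on the wire they are a POST to the compute API's os-server-external-events resource, roughly like the sketch below (instance and port UUIDs copied from the log, endpoint and token placeholders):

    import requests

    payload = {"events": [{
        "name": "network-vif-plugged",
        "server_uuid": "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1",
        "tag": "5a80355c-9e5f-4711-8703-ae50b6091cef",
        "status": "completed",
    }]}
    # Neutron sends this; Nova's compute manager then dispatches it as the
    # external_instance_event entries traced above.
    requests.post('http://controller.example.test/compute/v2.1/os-server-external-events',
                  json=payload, headers={'X-Auth-Token': '<token>'})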
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 587.352725] env[69648]: DEBUG oslo_concurrency.lockutils [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] Acquiring lock "refresh_cache-9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 587.352725] env[69648]: DEBUG oslo_concurrency.lockutils [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] Acquired lock "refresh_cache-9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 587.352725] env[69648]: DEBUG nova.network.neutron [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Refreshing network info cache for port 5a80355c-9e5f-4711-8703-ae50b6091cef {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 587.397318] env[69648]: DEBUG nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 587.468120] env[69648]: DEBUG nova.virt.hardware [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 587.468120] env[69648]: DEBUG nova.virt.hardware [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 587.468120] env[69648]: DEBUG nova.virt.hardware [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 587.468315] env[69648]: DEBUG nova.virt.hardware [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 587.468315] env[69648]: DEBUG nova.virt.hardware [None 
req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 587.468315] env[69648]: DEBUG nova.virt.hardware [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 587.468315] env[69648]: DEBUG nova.virt.hardware [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 587.470687] env[69648]: DEBUG nova.virt.hardware [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 587.471123] env[69648]: DEBUG nova.virt.hardware [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 587.474094] env[69648]: DEBUG nova.virt.hardware [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 587.474094] env[69648]: DEBUG nova.virt.hardware [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 587.474094] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e73f9777-ad99-4c09-9b3b-d32ec4cb07d4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 587.483020] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd8ed9c3-873b-4a83-b308-ca0c29d6087b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 587.605147] env[69648]: DEBUG nova.policy [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '03af9038ddaf4978a05379b1c973c744', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab8e75d74f024404b699476682537d40', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 
'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 587.744850] env[69648]: DEBUG nova.network.neutron [req-49f416fd-8acd-4312-802b-cb8362f8fad6 req-808dc00a-23ee-44b8-9194-6d2b209c769f service nova] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Updated VIF entry in instance network info cache for port 9cdb2d48-8d19-4027-ae0e-843a48fab651. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 587.745233] env[69648]: DEBUG nova.network.neutron [req-49f416fd-8acd-4312-802b-cb8362f8fad6 req-808dc00a-23ee-44b8-9194-6d2b209c769f service nova] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Updating instance_info_cache with network_info: [{"id": "9cdb2d48-8d19-4027-ae0e-843a48fab651", "address": "fa:16:3e:e8:50:81", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.117", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9cdb2d48-8d", "ovs_interfaceid": "9cdb2d48-8d19-4027-ae0e-843a48fab651", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 587.759330] env[69648]: DEBUG oslo_concurrency.lockutils [req-49f416fd-8acd-4312-802b-cb8362f8fad6 req-808dc00a-23ee-44b8-9194-6d2b209c769f service nova] Releasing lock "refresh_cache-549c349e-5417-408c-acb2-93e506476e2a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 589.009174] env[69648]: DEBUG nova.network.neutron [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Updated VIF entry in instance network info cache for port 5a80355c-9e5f-4711-8703-ae50b6091cef. 
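The vCPU-topology negotiation traced above reduces, for this flavor, to enumerating (sockets, cores, threads) triples whose product equals the flavor's vcpus within the 65536-per-dimension limits; a toy equivalent (Nova's real logic in nova.virt.hardware also weighs flavor and image preferences):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Yield every topology whose dimensions multiply out to the requested vcpus.
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        yield (sockets, cores, threads)

    print(list(possible_topologies(1)))   # [(1, 1, 1)] -- the single topology logged above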
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 589.009923] env[69648]: DEBUG nova.network.neutron [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Updating instance_info_cache with network_info: [{"id": "5a80355c-9e5f-4711-8703-ae50b6091cef", "address": "fa:16:3e:29:1b:e3", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5a80355c-9e", "ovs_interfaceid": "5a80355c-9e5f-4711-8703-ae50b6091cef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 589.024494] env[69648]: DEBUG oslo_concurrency.lockutils [req-3a7151d5-5d36-445f-ae96-948a1da1960c req-0df5bf14-d983-42c2-a012-dd5b1d9ff5e3 service nova] Releasing lock "refresh_cache-9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 589.065035] env[69648]: DEBUG nova.network.neutron [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Successfully created port: 37641368-6169-4a86-b126-4588176695fa {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 589.813027] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "20bce654-7f57-4de6-8f7a-c1b34286fc86" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 589.813289] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "20bce654-7f57-4de6-8f7a-c1b34286fc86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 589.832715] env[69648]: DEBUG nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 589.923081] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 589.923339] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 589.925100] env[69648]: INFO nova.compute.claims [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 589.960033] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquiring lock "928bc799-4fed-4005-89d2-e18196f88ffb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 589.962527] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Lock "928bc799-4fed-4005-89d2-e18196f88ffb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 589.979392] env[69648]: DEBUG nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 590.053374] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 590.206072] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c58f4e0-8c7f-42fd-90f0-5844d75771f1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.214294] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85a9aad6-d8e9-4abb-91e6-4a983574a1a4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.257057] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaba8ad6-d67c-49da-9ccf-0fae109d7b8c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.265425] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c27b966-c88d-43c6-8aa0-4cc3a7679adb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.281107] env[69648]: DEBUG nova.compute.provider_tree [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 590.293511] env[69648]: DEBUG nova.scheduler.client.report [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 590.311051] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.388s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 590.311580] env[69648]: DEBUG nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Start building networks asynchronously for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 590.316966] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.261s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 590.316966] env[69648]: INFO nova.compute.claims [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 590.364896] env[69648]: DEBUG nova.compute.utils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 590.366914] env[69648]: DEBUG nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 590.367165] env[69648]: DEBUG nova.network.neutron [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 590.380095] env[69648]: DEBUG nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 590.476333] env[69648]: DEBUG nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Start spawning the instance on the hypervisor. 
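
Annotation: "Allocating IP information in the background" marks the point where the Neutron allocate_for_instance() call is launched asynchronously while block device mappings are built; the result is consumed later, once the VIF info is actually needed for spawning. A generic sketch of that overlap using the standard library (Nova itself uses its own eventlet-based machinery, not ThreadPoolExecutor; the helper names below are placeholders):

    from concurrent.futures import ThreadPoolExecutor

    def allocate_network(instance_uuid):
        # Stand-in for the Neutron allocate_for_instance() call in the log.
        return {'instance': instance_uuid,
                'ports': ['cf663f04-efb4-4a4f-910b-56a841b0b037']}

    def build_block_devices(instance_uuid):
        # Stand-in for "Start building block device mappings for instance."
        return ['/dev/sda']

    uuid = '20bce654-7f57-4de6-8f7a-c1b34286fc86'
    with ThreadPoolExecutor(max_workers=1) as pool:
        nw_future = pool.submit(allocate_network, uuid)   # runs in background
        bdms = build_block_devices(uuid)                  # overlaps with it
        network_info = nw_future.result()                 # join before spawn
        print(bdms, network_info)
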
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 590.514483] env[69648]: DEBUG nova.virt.hardware [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 590.514483] env[69648]: DEBUG nova.virt.hardware [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 590.514483] env[69648]: DEBUG nova.virt.hardware [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 590.514679] env[69648]: DEBUG nova.virt.hardware [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 590.514962] env[69648]: DEBUG nova.virt.hardware [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 590.515323] env[69648]: DEBUG nova.virt.hardware [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 590.515716] env[69648]: DEBUG nova.virt.hardware [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 590.515982] env[69648]: DEBUG nova.virt.hardware [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 590.516292] env[69648]: DEBUG nova.virt.hardware [None 
req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 590.516963] env[69648]: DEBUG nova.virt.hardware [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 590.517261] env[69648]: DEBUG nova.virt.hardware [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 590.518608] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c67b3c77-bde2-4470-9048-9bfb8635244b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.538383] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ca8c775-caeb-4551-8282-6e7e50afac34 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.593945] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e799b61e-7ce9-49c6-91f5-6951ae5cd65e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.603324] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f443d197-7b81-4bbe-bfc3-cc25eae35949 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.650653] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15b32d6d-d92a-4889-b245-7972d6e92ff2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.658451] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-554f3ad3-94fb-4b19-9614-8f8f139fb6de {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.671993] env[69648]: DEBUG nova.compute.provider_tree [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 590.685225] env[69648]: DEBUG nova.scheduler.client.report [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 590.732696] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.416s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 590.732696] env[69648]: DEBUG nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 590.778989] env[69648]: DEBUG nova.policy [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a2e0dc0701ca4ef48aefff30ebd1526c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a3d06fdf00fb4237b20e95cfcdee2af1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 590.785403] env[69648]: DEBUG nova.compute.utils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 590.788043] env[69648]: DEBUG nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 590.788043] env[69648]: DEBUG nova.network.neutron [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 590.799926] env[69648]: DEBUG nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Start building block device mappings for instance. 
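
Annotation: "Policy check for network:attach_external_network failed" comes from nova/policy.py authorize(), which wraps oslo.policy: the request's credentials (member/reader roles, no admin) do not satisfy the rule, so the instance cannot attach directly to an external network. A hedged sketch of such a check made with oslo.policy directly; the 'role:admin' rule string is an illustrative stand-in, not Nova's actual default policy:

    from oslo_config import cfg
    from oslo_policy import policy

    enforcer = policy.Enforcer(cfg.CONF)
    enforcer.register_default(
        policy.RuleDefault('network:attach_external_network', 'role:admin',
                           description='Illustrative default rule'))

    # Credentials shaped like the dict in the log entry: member/reader roles.
    creds = {'roles': ['member', 'reader'], 'is_admin': False,
             'project_id': 'a3d06fdf00fb4237b20e95cfcdee2af1',
             'user_id': 'a2e0dc0701ca4ef48aefff30ebd1526c'}

    allowed = enforcer.enforce('network:attach_external_network',
                               target={}, creds=creds)
    print('policy check passed:', allowed)  # False for these credentials
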
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 590.809264] env[69648]: DEBUG nova.compute.manager [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Received event network-vif-plugged-b1b293d0-45ba-44c3-83cb-d776c37d09de {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 590.809476] env[69648]: DEBUG oslo_concurrency.lockutils [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] Acquiring lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 590.810079] env[69648]: DEBUG oslo_concurrency.lockutils [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] Lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 590.810079] env[69648]: DEBUG oslo_concurrency.lockutils [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] Lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 590.810079] env[69648]: DEBUG nova.compute.manager [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] No waiting events found dispatching network-vif-plugged-b1b293d0-45ba-44c3-83cb-d776c37d09de {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 590.810279] env[69648]: WARNING nova.compute.manager [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Received unexpected event network-vif-plugged-b1b293d0-45ba-44c3-83cb-d776c37d09de for instance with vm_state building and task_state spawning. [ 590.810470] env[69648]: DEBUG nova.compute.manager [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Received event network-changed-b1b293d0-45ba-44c3-83cb-d776c37d09de {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 590.810538] env[69648]: DEBUG nova.compute.manager [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Refreshing instance network info cache due to event network-changed-b1b293d0-45ba-44c3-83cb-d776c37d09de. 
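
Annotation: the req-5d0f98f3 entries show the external-event path: Neutron reports network-vif-plugged for a port, the compute manager tries to pop a matching waiter under the per-instance "-events" lock, and because nothing registered that event it is logged as unexpected (the instance is still building). A generic, threading-based sketch of that register/pop handshake, as an illustration of the idea rather than Nova's implementation:

    import threading

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()
            self._events = {}  # (instance_uuid, event_name) -> threading.Event

        def prepare(self, instance_uuid, event_name):
            waiter = threading.Event()
            with self._lock:
                self._events[(instance_uuid, event_name)] = waiter
            return waiter

        def pop(self, instance_uuid, event_name):
            with self._lock:
                return self._events.pop((instance_uuid, event_name), None)

    events = InstanceEvents()

    def external_event(instance_uuid, event_name):
        waiter = events.pop(instance_uuid, event_name)
        if waiter is None:
            print('No waiting events found dispatching', event_name)
        else:
            waiter.set()

    # Usage: register the expected event before plugging the VIF, then wait.
    w = events.prepare('ce04d2df', 'network-vif-plugged-b1b293d0')
    external_event('ce04d2df', 'network-vif-plugged-b1b293d0')
    print('vif plugged:', w.wait(timeout=1))
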
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 590.810719] env[69648]: DEBUG oslo_concurrency.lockutils [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] Acquiring lock "refresh_cache-ce04d2df-8587-4cda-93b1-cad7ba3ff670" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 590.810992] env[69648]: DEBUG oslo_concurrency.lockutils [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] Acquired lock "refresh_cache-ce04d2df-8587-4cda-93b1-cad7ba3ff670" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 590.811070] env[69648]: DEBUG nova.network.neutron [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Refreshing network info cache for port b1b293d0-45ba-44c3-83cb-d776c37d09de {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 590.892509] env[69648]: DEBUG nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 590.921661] env[69648]: DEBUG nova.virt.hardware [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 590.922018] env[69648]: DEBUG nova.virt.hardware [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 590.922925] env[69648]: DEBUG nova.virt.hardware [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 590.922925] env[69648]: DEBUG nova.virt.hardware [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 590.922925] env[69648]: DEBUG nova.virt.hardware [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 590.922925] env[69648]: DEBUG nova.virt.hardware [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 590.922925] env[69648]: DEBUG nova.virt.hardware [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 590.923144] env[69648]: DEBUG nova.virt.hardware [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 590.923144] env[69648]: DEBUG nova.virt.hardware [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 590.923567] env[69648]: DEBUG nova.virt.hardware [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 590.923567] env[69648]: DEBUG nova.virt.hardware [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 590.925030] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95096ac8-8573-400c-9b41-aa50aae082bf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.934302] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81a93edc-5e96-406f-8aa8-113494ce48ed {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 591.356116] env[69648]: DEBUG nova.policy [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '44174b26fb4c47c897463d83787864c8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': 
None, 'project_id': '1eee2d0a960e4a56b93f41defb6ff5fe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 592.034233] env[69648]: DEBUG nova.network.neutron [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Successfully updated port: 37641368-6169-4a86-b126-4588176695fa {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 592.047792] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquiring lock "refresh_cache-dfbb396b-8f18-456d-9064-be451cdd1ac9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 592.048349] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquired lock "refresh_cache-dfbb396b-8f18-456d-9064-be451cdd1ac9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 592.048519] env[69648]: DEBUG nova.network.neutron [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 592.219030] env[69648]: DEBUG nova.network.neutron [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 592.837220] env[69648]: DEBUG nova.network.neutron [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Successfully created port: cf663f04-efb4-4a4f-910b-56a841b0b037 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 592.904164] env[69648]: DEBUG nova.network.neutron [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Updated VIF entry in instance network info cache for port b1b293d0-45ba-44c3-83cb-d776c37d09de. 
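
Annotation: the "Updated VIF entry in instance network info cache" entry and the instance_info_cache payload that follows show the cached network_info format: one element per VIF carrying its MAC address, devname, OVS interface id, and the network's subnets, fixed IPs and MTU. A small sketch that pulls the commonly used fields out of such an entry; the literal below is a trimmed copy of the logged data, handled as plain JSON:

    import json

    cached = json.loads('''
    [{"id": "b1b293d0-45ba-44c3-83cb-d776c37d09de",
      "address": "fa:16:3e:82:26:64",
      "devname": "tapb1b293d0-45",
      "network": {"bridge": "br-int",
                  "subnets": [{"cidr": "192.168.128.0/28",
                               "ips": [{"address": "192.168.128.5",
                                        "type": "fixed", "version": 4}]}],
                  "meta": {"mtu": 8950}},
      "type": "ovs",
      "details": {"segmentation_id": 317}}]
    ''')

    for vif in cached:
        ips = [ip['address']
               for subnet in vif['network']['subnets']
               for ip in subnet['ips']]
        print(vif['id'], vif['address'], vif['devname'],
              'ips=%s' % ips, 'mtu=%s' % vif['network']['meta']['mtu'])
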
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 592.904575] env[69648]: DEBUG nova.network.neutron [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Updating instance_info_cache with network_info: [{"id": "b1b293d0-45ba-44c3-83cb-d776c37d09de", "address": "fa:16:3e:82:26:64", "network": {"id": "59475999-907e-40ca-afd5-2a03b1301111", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1952049877-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c5941981f56d4cefa3683a8a350d3032", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d0c6fd7-3cc9-4818-9475-8f15900394cc", "external-id": "nsx-vlan-transportzone-317", "segmentation_id": 317, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb1b293d0-45", "ovs_interfaceid": "b1b293d0-45ba-44c3-83cb-d776c37d09de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 592.915803] env[69648]: DEBUG oslo_concurrency.lockutils [req-5d0f98f3-dfb4-4a9a-b37f-76b01d164955 req-95957c47-d99b-4331-a081-7a234528d9b3 service nova] Releasing lock "refresh_cache-ce04d2df-8587-4cda-93b1-cad7ba3ff670" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 593.238309] env[69648]: DEBUG nova.network.neutron [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Successfully created port: 91d52ed1-e558-4488-a238-71f89b68ba6c {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 593.610602] env[69648]: DEBUG nova.network.neutron [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Updating instance_info_cache with network_info: [{"id": "37641368-6169-4a86-b126-4588176695fa", "address": "fa:16:3e:74:35:4c", "network": {"id": "bfe77ffc-ad8a-4def-bc6f-facdeb8e2d96", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1005180974-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab8e75d74f024404b699476682537d40", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ccbc7a-cf8d-4ea2-8411-291a1e27df7b", "external-id": "nsx-vlan-transportzone-998", "segmentation_id": 998, "bound_drivers": {"0": "nsxv3"}}, 
"devname": "tap37641368-61", "ovs_interfaceid": "37641368-6169-4a86-b126-4588176695fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 593.627356] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Releasing lock "refresh_cache-dfbb396b-8f18-456d-9064-be451cdd1ac9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 593.629127] env[69648]: DEBUG nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Instance network_info: |[{"id": "37641368-6169-4a86-b126-4588176695fa", "address": "fa:16:3e:74:35:4c", "network": {"id": "bfe77ffc-ad8a-4def-bc6f-facdeb8e2d96", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1005180974-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab8e75d74f024404b699476682537d40", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ccbc7a-cf8d-4ea2-8411-291a1e27df7b", "external-id": "nsx-vlan-transportzone-998", "segmentation_id": 998, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap37641368-61", "ovs_interfaceid": "37641368-6169-4a86-b126-4588176695fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 593.630702] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:74:35:4c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '04ccbc7a-cf8d-4ea2-8411-291a1e27df7b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '37641368-6169-4a86-b126-4588176695fa', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 593.643569] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Creating folder: Project (ab8e75d74f024404b699476682537d40). Parent ref: group-v692308. 
{{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 593.644342] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dd7c47b8-43dd-4028-a6b5-15ff7c086b9d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.663613] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Created folder: Project (ab8e75d74f024404b699476682537d40) in parent group-v692308. [ 593.663613] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Creating folder: Instances. Parent ref: group-v692327. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 593.664124] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5b0ac4a1-8670-4ac3-85ff-6c3fdd5128e4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.675975] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Created folder: Instances in parent group-v692327. [ 593.676311] env[69648]: DEBUG oslo.service.loopingcall [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 593.677508] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 593.677594] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-946c6226-8041-4c1f-bb67-86f99fc896f8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.704242] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 593.704242] env[69648]: value = "task-3466473" [ 593.704242] env[69648]: _type = "Task" [ 593.704242] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 593.713141] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466473, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 594.214280] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466473, 'name': CreateVM_Task, 'duration_secs': 0.277749} completed successfully. 
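
Annotation: the vmwareapi driver creates a per-project folder under the parent group-v692308 and then an "Instances" folder beneath it before building the VM (the Folder.CreateFolder invocations and "Created folder" lines above). A hedged sketch of the same calls made directly through oslo.vmware; the vCenter address and credentials are placeholders, the constructor argument order is assumed to be host, username, password, retry count, task poll interval, and the folder names are taken from the log:

    from oslo_vmware import api, vim_util

    def create_project_folders(session):
        # Build a managed object reference for the parent folder seen in the log.
        parent = vim_util.get_moref('group-v692308', 'Folder')
        project = session.invoke_api(
            session.vim, 'CreateFolder', parent,
            name='Project (ab8e75d74f024404b699476682537d40)')
        # Nested call: the "Instances" folder under the new project folder.
        return session.invoke_api(session.vim, 'CreateFolder', project,
                                  name='Instances')

    if __name__ == '__main__':
        # Placeholder endpoint and credentials; logs into the vCenter SDK.
        session = api.VMwareAPISession('vc.example.com', 'user', 'secret', 10, 0.5)
        print(create_project_folders(session))
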
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 594.215364] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 594.215364] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 594.215364] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 594.215709] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 594.216564] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-894e7a5d-479d-46ca-9961-9f3cf39f7b4e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.223510] env[69648]: DEBUG oslo_vmware.api [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Waiting for the task: (returnval){ [ 594.223510] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52dac0e0-fed8-1da1-5d60-ef8b32900791" [ 594.223510] env[69648]: _type = "Task" [ 594.223510] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 594.229634] env[69648]: DEBUG oslo_vmware.api [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52dac0e0-fed8-1da1-5d60-ef8b32900791, 'name': SearchDatastore_Task} progress is 0%. 
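
Annotation: CreateVM_Task and SearchDatastore_Task above are asynchronous vSphere tasks; the "Waiting for the task ... to complete", "progress is 0%" and "completed successfully" lines are oslo.vmware polling them. A sketch of that pattern, assuming a connected oslo_vmware.api.VMwareAPISession and placeholder morefs for the folder, config spec and resource pool:

    def create_vm(session, vm_folder, config_spec, resource_pool):
        # Kick off the asynchronous task on the target folder.
        task = session.invoke_api(session.vim, 'CreateVM_Task', vm_folder,
                                  config=config_spec, pool=resource_pool)
        # Blocks, polling at the session's task_poll_interval, and returns the
        # completed task info (or raises if the task errors out).
        task_info = session.wait_for_task(task)
        return task_info.result  # managed object reference of the new VM
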
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 594.698692] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "ed63f202-c76d-4492-b738-606ee1c6b059" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 594.699448] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "ed63f202-c76d-4492-b738-606ee1c6b059" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 594.719067] env[69648]: DEBUG nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 594.734939] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 594.735227] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 594.735438] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 594.795300] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 594.796136] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 594.797786] env[69648]: INFO nova.compute.claims [None 
req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 595.030569] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1849ca34-09f9-4214-beb1-082442293001 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.038653] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15edf727-72f3-471a-b145-bc1177818735 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.074012] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3503e702-ec11-4080-912f-2fb5c9f76532 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.082322] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a11d1038-755c-4e2f-94a7-0af0f594323e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.098760] env[69648]: DEBUG nova.compute.provider_tree [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 595.109523] env[69648]: DEBUG nova.scheduler.client.report [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 595.131561] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.335s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 595.132061] env[69648]: DEBUG nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Start building networks asynchronously for instance. 
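
Annotation: the nova.virt.hardware "Getting desirable topologies" entries that follow (matching the earlier blocks around 590.51 and 590.92) negotiate a guest CPU topology: with no flavor or image limits set, the preferred topology is 0:0:0 and the maxima default to 65536, so for the 1-vCPU m1.nano flavor the only candidate is sockets=1, cores=1, threads=1. A simplified, self-contained illustration of that enumeration, not Nova's exact algorithm:

    def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
        # Enumerate (sockets, cores, threads) factorizations of the vCPU count
        # that stay within the limits; each factor can be at most vcpus.
        found = []
        for sockets in range(1, min(max_sockets, vcpus) + 1):
            for cores in range(1, min(max_cores, vcpus) + 1):
                for threads in range(1, min(max_threads, vcpus) + 1):
                    if sockets * cores * threads == vcpus:
                        found.append((sockets, cores, threads))
        return found

    print(possible_topologies(1, 65536, 65536, 65536))  # [(1, 1, 1)], as logged
    print(possible_topologies(4, 65536, 65536, 65536))  # several candidates
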
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 595.179963] env[69648]: DEBUG nova.compute.utils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 595.182212] env[69648]: DEBUG nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 595.183246] env[69648]: DEBUG nova.network.neutron [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 595.192727] env[69648]: DEBUG nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 595.275916] env[69648]: DEBUG nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 595.309875] env[69648]: DEBUG nova.virt.hardware [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 595.310295] env[69648]: DEBUG nova.virt.hardware [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 595.310295] env[69648]: DEBUG nova.virt.hardware [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 595.310453] 
env[69648]: DEBUG nova.virt.hardware [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 595.310678] env[69648]: DEBUG nova.virt.hardware [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 595.310741] env[69648]: DEBUG nova.virt.hardware [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 595.311045] env[69648]: DEBUG nova.virt.hardware [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 595.311112] env[69648]: DEBUG nova.virt.hardware [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 595.311314] env[69648]: DEBUG nova.virt.hardware [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 595.311721] env[69648]: DEBUG nova.virt.hardware [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 595.312046] env[69648]: DEBUG nova.virt.hardware [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 595.313190] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a428855d-aec8-4f0e-88d7-30b42af5d46f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.323284] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-540c4957-18ec-42a3-99c5-601a4fcfa16b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.684482] env[69648]: DEBUG nova.policy [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 
'a2e0dc0701ca4ef48aefff30ebd1526c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a3d06fdf00fb4237b20e95cfcdee2af1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 595.960948] env[69648]: DEBUG nova.network.neutron [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Successfully updated port: cf663f04-efb4-4a4f-910b-56a841b0b037 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 595.979928] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "refresh_cache-20bce654-7f57-4de6-8f7a-c1b34286fc86" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 595.980241] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquired lock "refresh_cache-20bce654-7f57-4de6-8f7a-c1b34286fc86" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 595.980447] env[69648]: DEBUG nova.network.neutron [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 596.276381] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 596.276661] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 596.288436] env[69648]: DEBUG nova.network.neutron [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 596.505927] env[69648]: DEBUG nova.network.neutron [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Successfully updated port: 91d52ed1-e558-4488-a238-71f89b68ba6c {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 596.518275] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquiring lock "refresh_cache-928bc799-4fed-4005-89d2-e18196f88ffb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 596.518484] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquired lock "refresh_cache-928bc799-4fed-4005-89d2-e18196f88ffb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 596.518780] env[69648]: DEBUG nova.network.neutron [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 596.666347] env[69648]: DEBUG nova.network.neutron [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 597.371826] env[69648]: DEBUG nova.network.neutron [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Updating instance_info_cache with network_info: [{"id": "cf663f04-efb4-4a4f-910b-56a841b0b037", "address": "fa:16:3e:a8:3c:9c", "network": {"id": "add7e0ae-c621-4a5b-a4fe-c7fc9a267c92", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2032585077-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a3d06fdf00fb4237b20e95cfcdee2af1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd72ef32-a57c-43b0-93df-e8a030987d44", "external-id": "nsx-vlan-transportzone-340", "segmentation_id": 340, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf663f04-ef", "ovs_interfaceid": "cf663f04-efb4-4a4f-910b-56a841b0b037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 597.391751] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Releasing lock "refresh_cache-20bce654-7f57-4de6-8f7a-c1b34286fc86" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 597.392424] env[69648]: DEBUG nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Instance network_info: |[{"id": "cf663f04-efb4-4a4f-910b-56a841b0b037", "address": "fa:16:3e:a8:3c:9c", "network": {"id": "add7e0ae-c621-4a5b-a4fe-c7fc9a267c92", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2032585077-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a3d06fdf00fb4237b20e95cfcdee2af1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd72ef32-a57c-43b0-93df-e8a030987d44", "external-id": "nsx-vlan-transportzone-340", "segmentation_id": 340, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf663f04-ef", "ovs_interfaceid": "cf663f04-efb4-4a4f-910b-56a841b0b037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 597.393130] env[69648]: 
DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a8:3c:9c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dd72ef32-a57c-43b0-93df-e8a030987d44', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cf663f04-efb4-4a4f-910b-56a841b0b037', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 597.401262] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Creating folder: Project (a3d06fdf00fb4237b20e95cfcdee2af1). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 597.401882] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-54fc4571-3422-49b5-85e2-83c6371c53ba {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 597.417822] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Created folder: Project (a3d06fdf00fb4237b20e95cfcdee2af1) in parent group-v692308. [ 597.418906] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Creating folder: Instances. Parent ref: group-v692330. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 597.419739] env[69648]: DEBUG nova.network.neutron [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Successfully created port: ddd5fcb8-07b3-4347-b45e-098f0f293ba3 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 597.421592] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-229f25d9-7ce6-4620-95f7-a53160da3361 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 597.432661] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Created folder: Instances in parent group-v692330. [ 597.433935] env[69648]: DEBUG oslo.service.loopingcall [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 597.434140] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 597.434415] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-21800624-725e-4475-907c-106ad1d6ddcb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 597.459010] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 597.459010] env[69648]: value = "task-3466476" [ 597.459010] env[69648]: _type = "Task" [ 597.459010] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 597.470125] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466476, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 597.572080] env[69648]: DEBUG nova.network.neutron [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Updating instance_info_cache with network_info: [{"id": "91d52ed1-e558-4488-a238-71f89b68ba6c", "address": "fa:16:3e:41:14:57", "network": {"id": "d56f22c7-9161-428c-8257-7a261f52fc44", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1035741553-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1eee2d0a960e4a56b93f41defb6ff5fe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap91d52ed1-e5", "ovs_interfaceid": "91d52ed1-e558-4488-a238-71f89b68ba6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 597.587138] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Releasing lock "refresh_cache-928bc799-4fed-4005-89d2-e18196f88ffb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 597.587138] env[69648]: DEBUG nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Instance network_info: |[{"id": "91d52ed1-e558-4488-a238-71f89b68ba6c", "address": "fa:16:3e:41:14:57", "network": {"id": "d56f22c7-9161-428c-8257-7a261f52fc44", "bridge": "br-int", "label": 
"tempest-InstanceActionsNegativeTestJSON-1035741553-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1eee2d0a960e4a56b93f41defb6ff5fe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap91d52ed1-e5", "ovs_interfaceid": "91d52ed1-e558-4488-a238-71f89b68ba6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 597.587360] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:41:14:57', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '91d52ed1-e558-4488-a238-71f89b68ba6c', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 597.600886] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Creating folder: Project (1eee2d0a960e4a56b93f41defb6ff5fe). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 597.602372] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-67e757c9-be3d-4174-93b4-ea6ae56147fd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 597.617882] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Created folder: Project (1eee2d0a960e4a56b93f41defb6ff5fe) in parent group-v692308. [ 597.617882] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Creating folder: Instances. Parent ref: group-v692333. 
{{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 597.617882] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3eeed519-5092-4b60-8f75-de871fea84d4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 598.194963] env[69648]: DEBUG nova.compute.manager [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Received event network-vif-plugged-37641368-6169-4a86-b126-4588176695fa {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 598.195189] env[69648]: DEBUG oslo_concurrency.lockutils [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] Acquiring lock "dfbb396b-8f18-456d-9064-be451cdd1ac9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 598.195392] env[69648]: DEBUG oslo_concurrency.lockutils [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] Lock "dfbb396b-8f18-456d-9064-be451cdd1ac9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 598.195555] env[69648]: DEBUG oslo_concurrency.lockutils [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] Lock "dfbb396b-8f18-456d-9064-be451cdd1ac9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 598.195720] env[69648]: DEBUG nova.compute.manager [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] No waiting events found dispatching network-vif-plugged-37641368-6169-4a86-b126-4588176695fa {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 598.195921] env[69648]: WARNING nova.compute.manager [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Received unexpected event network-vif-plugged-37641368-6169-4a86-b126-4588176695fa for instance with vm_state building and task_state spawning. [ 598.196050] env[69648]: DEBUG nova.compute.manager [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Received event network-changed-37641368-6169-4a86-b126-4588176695fa {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 598.196203] env[69648]: DEBUG nova.compute.manager [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Refreshing instance network info cache due to event network-changed-37641368-6169-4a86-b126-4588176695fa. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 598.196433] env[69648]: DEBUG oslo_concurrency.lockutils [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] Acquiring lock "refresh_cache-dfbb396b-8f18-456d-9064-be451cdd1ac9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 598.196537] env[69648]: DEBUG oslo_concurrency.lockutils [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] Acquired lock "refresh_cache-dfbb396b-8f18-456d-9064-be451cdd1ac9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 598.197540] env[69648]: DEBUG nova.network.neutron [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Refreshing network info cache for port 37641368-6169-4a86-b126-4588176695fa {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 598.201269] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "642ba6f1-b912-4f55-9199-9c98b58ffc1b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 598.201269] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "642ba6f1-b912-4f55-9199-9c98b58ffc1b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 598.203023] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Created folder: Instances in parent group-v692333. [ 598.203161] env[69648]: DEBUG oslo.service.loopingcall [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 598.203902] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 598.204311] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6d8487f0-965f-4b42-af79-ff5308fd07eb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 598.224347] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466476, 'name': CreateVM_Task, 'duration_secs': 0.325707} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 598.224888] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 598.226671] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 598.226671] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 598.226671] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 598.226807] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bedd53e2-7324-45cd-8d97-6fbbdfedc380 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 598.229833] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 598.229833] env[69648]: value = "task-3466479" [ 598.229833] env[69648]: _type = "Task" [ 598.229833] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 598.231521] env[69648]: DEBUG oslo_vmware.api [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Waiting for the task: (returnval){ [ 598.231521] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52517da2-6a4f-923d-df3f-401a37a6ac6f" [ 598.231521] env[69648]: _type = "Task" [ 598.231521] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 598.249841] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466479, 'name': CreateVM_Task} progress is 5%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 598.251777] env[69648]: DEBUG oslo_vmware.api [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52517da2-6a4f-923d-df3f-401a37a6ac6f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 598.744108] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466479, 'name': CreateVM_Task} progress is 99%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 598.748090] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 598.748370] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 598.748588] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 599.246659] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466479, 'name': CreateVM_Task} progress is 99%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 599.751725] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466479, 'name': CreateVM_Task, 'duration_secs': 1.355214} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 599.754197] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 599.754197] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 599.754197] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 599.754197] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 599.754197] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-97ce3156-7ca6-434f-b132-02d177834b83 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 599.760298] env[69648]: DEBUG oslo_vmware.api [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Waiting for the task: (returnval){ [ 599.760298] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52f274d4-04d9-a411-cc62-622ba24755ae" [ 599.760298] env[69648]: _type = "Task" [ 599.760298] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 599.771513] env[69648]: DEBUG oslo_vmware.api [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52f274d4-04d9-a411-cc62-622ba24755ae, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 599.783155] env[69648]: DEBUG nova.network.neutron [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Updated VIF entry in instance network info cache for port 37641368-6169-4a86-b126-4588176695fa. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 599.783514] env[69648]: DEBUG nova.network.neutron [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Updating instance_info_cache with network_info: [{"id": "37641368-6169-4a86-b126-4588176695fa", "address": "fa:16:3e:74:35:4c", "network": {"id": "bfe77ffc-ad8a-4def-bc6f-facdeb8e2d96", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1005180974-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab8e75d74f024404b699476682537d40", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ccbc7a-cf8d-4ea2-8411-291a1e27df7b", "external-id": "nsx-vlan-transportzone-998", "segmentation_id": 998, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap37641368-61", "ovs_interfaceid": "37641368-6169-4a86-b126-4588176695fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 599.801941] env[69648]: DEBUG oslo_concurrency.lockutils [req-8a85afae-d7e9-4987-af99-13d065f29b1b req-9fbdd3ce-f36d-478a-9512-4ee2d686c7c4 service nova] Releasing lock "refresh_cache-dfbb396b-8f18-456d-9064-be451cdd1ac9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 599.972377] env[69648]: DEBUG nova.compute.manager [req-568423dd-d95c-43e2-877a-6d955253dd21 req-491bddcc-79da-4cfa-8682-f9f6ab2e682c service nova] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Received event 
network-vif-plugged-cf663f04-efb4-4a4f-910b-56a841b0b037 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 599.972723] env[69648]: DEBUG oslo_concurrency.lockutils [req-568423dd-d95c-43e2-877a-6d955253dd21 req-491bddcc-79da-4cfa-8682-f9f6ab2e682c service nova] Acquiring lock "20bce654-7f57-4de6-8f7a-c1b34286fc86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 599.972866] env[69648]: DEBUG oslo_concurrency.lockutils [req-568423dd-d95c-43e2-877a-6d955253dd21 req-491bddcc-79da-4cfa-8682-f9f6ab2e682c service nova] Lock "20bce654-7f57-4de6-8f7a-c1b34286fc86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 599.973063] env[69648]: DEBUG oslo_concurrency.lockutils [req-568423dd-d95c-43e2-877a-6d955253dd21 req-491bddcc-79da-4cfa-8682-f9f6ab2e682c service nova] Lock "20bce654-7f57-4de6-8f7a-c1b34286fc86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 599.974713] env[69648]: DEBUG nova.compute.manager [req-568423dd-d95c-43e2-877a-6d955253dd21 req-491bddcc-79da-4cfa-8682-f9f6ab2e682c service nova] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] No waiting events found dispatching network-vif-plugged-cf663f04-efb4-4a4f-910b-56a841b0b037 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 599.974713] env[69648]: WARNING nova.compute.manager [req-568423dd-d95c-43e2-877a-6d955253dd21 req-491bddcc-79da-4cfa-8682-f9f6ab2e682c service nova] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Received unexpected event network-vif-plugged-cf663f04-efb4-4a4f-910b-56a841b0b037 for instance with vm_state building and task_state spawning. 
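The "Acquiring lock … acquired … released" triplets above (for the per-instance "…-events" names and the "refresh_cache-…" names) come from oslo.concurrency's lockutils, which Nova uses to serialize event dispatch and network-info-cache refreshes per instance. The snippet below is a minimal, illustrative sketch of that locking pattern, not Nova's actual code; the guarded bodies are hypothetical, and only the lock-name shapes are taken from the log lines above.

```python
# Minimal sketch of the oslo.concurrency locking pattern behind the
# "Acquiring lock ... / ... acquired ... / ... released" DEBUG lines.
from oslo_concurrency import lockutils

# Decorator form: serialize per-instance event handling (hypothetical body).
@lockutils.synchronized("20bce654-7f57-4de6-8f7a-c1b34286fc86-events")
def pop_instance_event_example():
    pass  # pop/dispatch the waiting external event for this instance

# Context-manager form: guard a network-info cache refresh (hypothetical body).
with lockutils.lock("refresh_cache-20bce654-7f57-4de6-8f7a-c1b34286fc86"):
    pass  # rebuild and store the instance's network_info cache here
```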
[ 600.275259] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 600.275746] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 600.275986] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 600.606276] env[69648]: DEBUG nova.network.neutron [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Successfully updated port: ddd5fcb8-07b3-4347-b45e-098f0f293ba3 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 600.618403] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "refresh_cache-ed63f202-c76d-4492-b738-606ee1c6b059" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 600.618403] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquired lock "refresh_cache-ed63f202-c76d-4492-b738-606ee1c6b059" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 600.618403] env[69648]: DEBUG nova.network.neutron [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 600.759686] env[69648]: DEBUG nova.network.neutron [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 601.750714] env[69648]: DEBUG nova.network.neutron [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Updating instance_info_cache with network_info: [{"id": "ddd5fcb8-07b3-4347-b45e-098f0f293ba3", "address": "fa:16:3e:6c:9d:31", "network": {"id": "add7e0ae-c621-4a5b-a4fe-c7fc9a267c92", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2032585077-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a3d06fdf00fb4237b20e95cfcdee2af1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd72ef32-a57c-43b0-93df-e8a030987d44", "external-id": "nsx-vlan-transportzone-340", "segmentation_id": 340, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapddd5fcb8-07", "ovs_interfaceid": "ddd5fcb8-07b3-4347-b45e-098f0f293ba3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 601.764307] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Releasing lock "refresh_cache-ed63f202-c76d-4492-b738-606ee1c6b059" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 601.764793] env[69648]: DEBUG nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Instance network_info: |[{"id": "ddd5fcb8-07b3-4347-b45e-098f0f293ba3", "address": "fa:16:3e:6c:9d:31", "network": {"id": "add7e0ae-c621-4a5b-a4fe-c7fc9a267c92", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2032585077-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a3d06fdf00fb4237b20e95cfcdee2af1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd72ef32-a57c-43b0-93df-e8a030987d44", "external-id": "nsx-vlan-transportzone-340", "segmentation_id": 340, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapddd5fcb8-07", "ovs_interfaceid": "ddd5fcb8-07b3-4347-b45e-098f0f293ba3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 601.765554] env[69648]: 
DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6c:9d:31', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dd72ef32-a57c-43b0-93df-e8a030987d44', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ddd5fcb8-07b3-4347-b45e-098f0f293ba3', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 601.775428] env[69648]: DEBUG oslo.service.loopingcall [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 601.775982] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 601.776322] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-62bdeb26-73ea-4872-bf07-69dd62b5ede2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.802101] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 601.802101] env[69648]: value = "task-3466480" [ 601.802101] env[69648]: _type = "Task" [ 601.802101] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 601.811328] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466480, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 602.078419] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "91fcee48-3466-480d-bf87-bc4de17fbf31" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 602.078419] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "91fcee48-3466-480d-bf87-bc4de17fbf31" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 602.314969] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466480, 'name': CreateVM_Task, 'duration_secs': 0.305971} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 602.315889] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 602.316796] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 602.317073] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 602.317671] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 602.317972] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ac040c36-c79f-4649-bfea-19074be4c656 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.323450] env[69648]: DEBUG oslo_vmware.api [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Waiting for the task: (returnval){ [ 602.323450] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5214e040-eccb-2711-1001-e56140bd10da" [ 602.323450] env[69648]: _type = "Task" [ 602.323450] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 602.333326] env[69648]: DEBUG oslo_vmware.api [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5214e040-eccb-2711-1001-e56140bd10da, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 602.758784] env[69648]: DEBUG nova.compute.manager [req-2cd2e628-f590-486c-b689-705d1c586bf0 req-9fdeab3b-c09e-4478-a991-40812a6dcf41 service nova] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Received event network-vif-plugged-ddd5fcb8-07b3-4347-b45e-098f0f293ba3 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 602.761025] env[69648]: DEBUG oslo_concurrency.lockutils [req-2cd2e628-f590-486c-b689-705d1c586bf0 req-9fdeab3b-c09e-4478-a991-40812a6dcf41 service nova] Acquiring lock "ed63f202-c76d-4492-b738-606ee1c6b059-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 602.761025] env[69648]: DEBUG oslo_concurrency.lockutils [req-2cd2e628-f590-486c-b689-705d1c586bf0 req-9fdeab3b-c09e-4478-a991-40812a6dcf41 service nova] Lock "ed63f202-c76d-4492-b738-606ee1c6b059-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 602.761025] env[69648]: DEBUG oslo_concurrency.lockutils [req-2cd2e628-f590-486c-b689-705d1c586bf0 req-9fdeab3b-c09e-4478-a991-40812a6dcf41 service nova] Lock "ed63f202-c76d-4492-b738-606ee1c6b059-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 602.761025] env[69648]: DEBUG nova.compute.manager [req-2cd2e628-f590-486c-b689-705d1c586bf0 req-9fdeab3b-c09e-4478-a991-40812a6dcf41 service nova] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] No waiting events found dispatching network-vif-plugged-ddd5fcb8-07b3-4347-b45e-098f0f293ba3 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 602.761238] env[69648]: WARNING nova.compute.manager [req-2cd2e628-f590-486c-b689-705d1c586bf0 req-9fdeab3b-c09e-4478-a991-40812a6dcf41 service nova] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Received unexpected event network-vif-plugged-ddd5fcb8-07b3-4347-b45e-098f0f293ba3 for instance with vm_state building and task_state spawning. 
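The "Invoking Folder.CreateVM_Task", "Waiting for the task: (returnval){…}" and "_poll_task … progress is N%" entries above reflect oslo.vmware's call-then-poll cycle: a SOAP method returns a Task managed object and the session polls it until completion. The sketch below illustrates that cycle assuming the standard oslo.vmware session API; the connection parameters and the vm_folder/config_spec/res_pool references are hypothetical placeholders, not values from this deployment.

```python
# Illustrative sketch of the oslo.vmware call/poll cycle (placeholders only).
from oslo_vmware import api as vmware_api

session = vmware_api.VMwareAPISession(
    "vc.example.invalid",  # vCenter host (placeholder)
    "administrator",       # username (placeholder)
    "secret",              # password (placeholder)
    3,                     # api_retry_count
    0.5,                   # task_poll_interval: seconds between poll attempts
)

# Placeholders for a Folder moref, a VirtualMachineConfigSpec and a
# ResourcePool moref that a real caller would have built beforehand.
vm_folder = config_spec = res_pool = None

# invoke_api() issues the SOAP request; vCenter returns a Task object.
task = session.invoke_api(session.vim, "CreateVM_Task", vm_folder,
                          config=config_spec, pool=res_pool)

# wait_for_task() polls the task (the "progress is N%" lines above) and
# returns its TaskInfo on success, or raises if the task fails.
task_info = session.wait_for_task(task)
```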
[ 602.837935] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 602.838909] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 602.838909] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 603.812125] env[69648]: DEBUG nova.compute.manager [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Received event network-vif-plugged-91d52ed1-e558-4488-a238-71f89b68ba6c {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 603.812400] env[69648]: DEBUG oslo_concurrency.lockutils [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] Acquiring lock "928bc799-4fed-4005-89d2-e18196f88ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 603.812645] env[69648]: DEBUG oslo_concurrency.lockutils [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] Lock "928bc799-4fed-4005-89d2-e18196f88ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 603.812960] env[69648]: DEBUG oslo_concurrency.lockutils [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] Lock "928bc799-4fed-4005-89d2-e18196f88ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 603.813051] env[69648]: DEBUG nova.compute.manager [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] No waiting events found dispatching network-vif-plugged-91d52ed1-e558-4488-a238-71f89b68ba6c {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 603.813434] env[69648]: WARNING nova.compute.manager [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Received unexpected event network-vif-plugged-91d52ed1-e558-4488-a238-71f89b68ba6c for instance with vm_state building and task_state spawning. 
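The image-cache lock lines above ("[datastore1] devstack-image-cache_base/<image-id>" and the nested "….vmdk" lock around "Processing image …") serialize the check-and-fetch of the cached base image so concurrent spawns of the same image on this node do not download or convert the VMDK twice. A hedged sketch of that check-then-fetch-under-lock idea follows; the helper callables are hypothetical, and only the lockutils primitive and the lock-name shape come from the log.

```python
# Hedged sketch: fill the per-image cache at most once, under a lock.
from oslo_concurrency import lockutils

IMAGE_ID = "b010aefa-553b-437c-bd1e-78b0a276a491"

def ensure_image_cached(image_is_cached, fetch_image_to_cache):
    """image_is_cached and fetch_image_to_cache are hypothetical callables."""
    lock_name = ("[datastore1] devstack-image-cache_base/%s/%s.vmdk"
                 % (IMAGE_ID, IMAGE_ID))
    with lockutils.lock(lock_name):
        # Only one greenthread per image id gets past this point at a time,
        # so the fetch/convert step runs at most once per cache miss.
        if not image_is_cached(IMAGE_ID):
            fetch_image_to_cache(IMAGE_ID)
```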
[ 603.813434] env[69648]: DEBUG nova.compute.manager [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Received event network-changed-cf663f04-efb4-4a4f-910b-56a841b0b037 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 603.813613] env[69648]: DEBUG nova.compute.manager [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Refreshing instance network info cache due to event network-changed-cf663f04-efb4-4a4f-910b-56a841b0b037. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 603.813843] env[69648]: DEBUG oslo_concurrency.lockutils [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] Acquiring lock "refresh_cache-20bce654-7f57-4de6-8f7a-c1b34286fc86" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 603.814049] env[69648]: DEBUG oslo_concurrency.lockutils [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] Acquired lock "refresh_cache-20bce654-7f57-4de6-8f7a-c1b34286fc86" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 603.814244] env[69648]: DEBUG nova.network.neutron [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Refreshing network info cache for port cf663f04-efb4-4a4f-910b-56a841b0b037 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 604.368460] env[69648]: DEBUG nova.network.neutron [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Updated VIF entry in instance network info cache for port cf663f04-efb4-4a4f-910b-56a841b0b037. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 604.368460] env[69648]: DEBUG nova.network.neutron [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Updating instance_info_cache with network_info: [{"id": "cf663f04-efb4-4a4f-910b-56a841b0b037", "address": "fa:16:3e:a8:3c:9c", "network": {"id": "add7e0ae-c621-4a5b-a4fe-c7fc9a267c92", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2032585077-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a3d06fdf00fb4237b20e95cfcdee2af1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd72ef32-a57c-43b0-93df-e8a030987d44", "external-id": "nsx-vlan-transportzone-340", "segmentation_id": 340, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf663f04-ef", "ovs_interfaceid": "cf663f04-efb4-4a4f-910b-56a841b0b037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 604.381759] env[69648]: DEBUG oslo_concurrency.lockutils [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] Releasing lock "refresh_cache-20bce654-7f57-4de6-8f7a-c1b34286fc86" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 604.382034] env[69648]: DEBUG nova.compute.manager [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Received event network-changed-91d52ed1-e558-4488-a238-71f89b68ba6c {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 604.382210] env[69648]: DEBUG nova.compute.manager [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Refreshing instance network info cache due to event network-changed-91d52ed1-e558-4488-a238-71f89b68ba6c. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 604.382411] env[69648]: DEBUG oslo_concurrency.lockutils [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] Acquiring lock "refresh_cache-928bc799-4fed-4005-89d2-e18196f88ffb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 604.382553] env[69648]: DEBUG oslo_concurrency.lockutils [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] Acquired lock "refresh_cache-928bc799-4fed-4005-89d2-e18196f88ffb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 604.382711] env[69648]: DEBUG nova.network.neutron [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Refreshing network info cache for port 91d52ed1-e558-4488-a238-71f89b68ba6c {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 604.677853] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 604.678128] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 605.513415] env[69648]: DEBUG nova.network.neutron [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Updated VIF entry in instance network info cache for port 91d52ed1-e558-4488-a238-71f89b68ba6c. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 605.513787] env[69648]: DEBUG nova.network.neutron [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Updating instance_info_cache with network_info: [{"id": "91d52ed1-e558-4488-a238-71f89b68ba6c", "address": "fa:16:3e:41:14:57", "network": {"id": "d56f22c7-9161-428c-8257-7a261f52fc44", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1035741553-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1eee2d0a960e4a56b93f41defb6ff5fe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap91d52ed1-e5", "ovs_interfaceid": "91d52ed1-e558-4488-a238-71f89b68ba6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 605.525458] env[69648]: DEBUG oslo_concurrency.lockutils [req-5636cf6f-b2b0-4915-889e-4862a68f648a req-21dddc84-2e4f-4803-b562-26305edff500 service nova] Releasing lock "refresh_cache-928bc799-4fed-4005-89d2-e18196f88ffb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 606.081659] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f9ae89c0-a4b6-4e91-939e-1300cc47ee17 tempest-InstanceActionsTestJSON-825947943 tempest-InstanceActionsTestJSON-825947943-project-member] Acquiring lock "741e7ee9-8ee1-4b36-89cc-a640e6f6b97a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 606.081909] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f9ae89c0-a4b6-4e91-939e-1300cc47ee17 tempest-InstanceActionsTestJSON-825947943 tempest-InstanceActionsTestJSON-825947943-project-member] Lock "741e7ee9-8ee1-4b36-89cc-a640e6f6b97a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 607.341105] env[69648]: DEBUG nova.compute.manager [req-048ffd55-f3ab-4bad-bbff-fa091336578b req-66f28b00-3715-42f8-a33d-f8f308165fbd service nova] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Received event network-changed-ddd5fcb8-07b3-4347-b45e-098f0f293ba3 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 607.341105] env[69648]: DEBUG nova.compute.manager [req-048ffd55-f3ab-4bad-bbff-fa091336578b req-66f28b00-3715-42f8-a33d-f8f308165fbd service nova] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Refreshing instance network info cache due to event network-changed-ddd5fcb8-07b3-4347-b45e-098f0f293ba3. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 607.341383] env[69648]: DEBUG oslo_concurrency.lockutils [req-048ffd55-f3ab-4bad-bbff-fa091336578b req-66f28b00-3715-42f8-a33d-f8f308165fbd service nova] Acquiring lock "refresh_cache-ed63f202-c76d-4492-b738-606ee1c6b059" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 607.341383] env[69648]: DEBUG oslo_concurrency.lockutils [req-048ffd55-f3ab-4bad-bbff-fa091336578b req-66f28b00-3715-42f8-a33d-f8f308165fbd service nova] Acquired lock "refresh_cache-ed63f202-c76d-4492-b738-606ee1c6b059" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 607.341615] env[69648]: DEBUG nova.network.neutron [req-048ffd55-f3ab-4bad-bbff-fa091336578b req-66f28b00-3715-42f8-a33d-f8f308165fbd service nova] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Refreshing network info cache for port ddd5fcb8-07b3-4347-b45e-098f0f293ba3 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 607.600330] env[69648]: DEBUG oslo_concurrency.lockutils [None req-334a8787-b1c3-40b5-9de0-7c5ec68f89b6 tempest-ServersTestMultiNic-1461458840 tempest-ServersTestMultiNic-1461458840-project-member] Acquiring lock "28f086a0-3197-47a9-ad8d-6cfd3a59bfc3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 607.600576] env[69648]: DEBUG oslo_concurrency.lockutils [None req-334a8787-b1c3-40b5-9de0-7c5ec68f89b6 tempest-ServersTestMultiNic-1461458840 tempest-ServersTestMultiNic-1461458840-project-member] Lock "28f086a0-3197-47a9-ad8d-6cfd3a59bfc3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 607.980975] env[69648]: DEBUG nova.network.neutron [req-048ffd55-f3ab-4bad-bbff-fa091336578b req-66f28b00-3715-42f8-a33d-f8f308165fbd service nova] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Updated VIF entry in instance network info cache for port ddd5fcb8-07b3-4347-b45e-098f0f293ba3. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 607.981381] env[69648]: DEBUG nova.network.neutron [req-048ffd55-f3ab-4bad-bbff-fa091336578b req-66f28b00-3715-42f8-a33d-f8f308165fbd service nova] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Updating instance_info_cache with network_info: [{"id": "ddd5fcb8-07b3-4347-b45e-098f0f293ba3", "address": "fa:16:3e:6c:9d:31", "network": {"id": "add7e0ae-c621-4a5b-a4fe-c7fc9a267c92", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2032585077-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a3d06fdf00fb4237b20e95cfcdee2af1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd72ef32-a57c-43b0-93df-e8a030987d44", "external-id": "nsx-vlan-transportzone-340", "segmentation_id": 340, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapddd5fcb8-07", "ovs_interfaceid": "ddd5fcb8-07b3-4347-b45e-098f0f293ba3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 607.994714] env[69648]: DEBUG oslo_concurrency.lockutils [req-048ffd55-f3ab-4bad-bbff-fa091336578b req-66f28b00-3715-42f8-a33d-f8f308165fbd service nova] Releasing lock "refresh_cache-ed63f202-c76d-4492-b738-606ee1c6b059" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 609.198450] env[69648]: DEBUG oslo_concurrency.lockutils [None req-50bbdc90-ca31-454d-b67a-a3c1c164785e tempest-ServerActionsTestOtherA-1331415416 tempest-ServerActionsTestOtherA-1331415416-project-member] Acquiring lock "ca4953f5-de55-4476-a845-c633a073eb43" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 609.199103] env[69648]: DEBUG oslo_concurrency.lockutils [None req-50bbdc90-ca31-454d-b67a-a3c1c164785e tempest-ServerActionsTestOtherA-1331415416 tempest-ServerActionsTestOtherA-1331415416-project-member] Lock "ca4953f5-de55-4476-a845-c633a073eb43" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 609.269183] env[69648]: DEBUG oslo_concurrency.lockutils [None req-549e9643-5f06-4610-b3c5-efe995377b89 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquiring lock "51ebee08-c929-4485-b229-cd99d35db2f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 609.269441] env[69648]: DEBUG oslo_concurrency.lockutils [None req-549e9643-5f06-4610-b3c5-efe995377b89 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock 
"51ebee08-c929-4485-b229-cd99d35db2f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 609.845478] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c25321c7-35b8-47f4-bcb9-9be24352aabe tempest-ServerDiagnosticsNegativeTest-607771934 tempest-ServerDiagnosticsNegativeTest-607771934-project-member] Acquiring lock "63c7b328-2c8a-42a6-b340-78528a656f9f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 609.845836] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c25321c7-35b8-47f4-bcb9-9be24352aabe tempest-ServerDiagnosticsNegativeTest-607771934 tempest-ServerDiagnosticsNegativeTest-607771934-project-member] Lock "63c7b328-2c8a-42a6-b340-78528a656f9f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 610.611377] env[69648]: DEBUG oslo_concurrency.lockutils [None req-023c7526-4e6c-430d-8752-1d26e778481b tempest-VolumesAssistedSnapshotsTest-798663970 tempest-VolumesAssistedSnapshotsTest-798663970-project-member] Acquiring lock "2b75aa03-750c-49b3-b69a-63bfee58942f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 610.611377] env[69648]: DEBUG oslo_concurrency.lockutils [None req-023c7526-4e6c-430d-8752-1d26e778481b tempest-VolumesAssistedSnapshotsTest-798663970 tempest-VolumesAssistedSnapshotsTest-798663970-project-member] Lock "2b75aa03-750c-49b3-b69a-63bfee58942f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 610.625509] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0acf2512-9ceb-49fd-b443-30cc5a5afecc tempest-ServerMetadataTestJSON-622148251 tempest-ServerMetadataTestJSON-622148251-project-member] Acquiring lock "b21a0b12-14c7-491f-ae08-f596924490d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 610.626325] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0acf2512-9ceb-49fd-b443-30cc5a5afecc tempest-ServerMetadataTestJSON-622148251 tempest-ServerMetadataTestJSON-622148251-project-member] Lock "b21a0b12-14c7-491f-ae08-f596924490d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 612.810592] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1b67cd95-2280-4587-b9c1-43459d5a801e tempest-ImagesOneServerTestJSON-509649065 tempest-ImagesOneServerTestJSON-509649065-project-member] Acquiring lock "267be1e0-7768-421d-9ae4-f8acb5331b23" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 612.810946] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1b67cd95-2280-4587-b9c1-43459d5a801e tempest-ImagesOneServerTestJSON-509649065 tempest-ImagesOneServerTestJSON-509649065-project-member] Lock "267be1e0-7768-421d-9ae4-f8acb5331b23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 613.621524] env[69648]: DEBUG oslo_concurrency.lockutils [None req-17192bcf-4759-495b-b0f0-a84d6adb918d tempest-ServersWithSpecificFlavorTestJSON-1998279834 tempest-ServersWithSpecificFlavorTestJSON-1998279834-project-member] Acquiring lock "50dd8f44-95ae-4b0c-ad88-2f11f4886d57" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 613.622081] env[69648]: DEBUG oslo_concurrency.lockutils [None req-17192bcf-4759-495b-b0f0-a84d6adb918d tempest-ServersWithSpecificFlavorTestJSON-1998279834 tempest-ServersWithSpecificFlavorTestJSON-1998279834-project-member] Lock "50dd8f44-95ae-4b0c-ad88-2f11f4886d57" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 618.796859] env[69648]: DEBUG oslo_concurrency.lockutils [None req-2327f0bc-09d4-4f66-bbd4-609e316788e6 tempest-ImagesNegativeTestJSON-660182022 tempest-ImagesNegativeTestJSON-660182022-project-member] Acquiring lock "613910dc-5ba5-482f-8b77-bf978fe622dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 618.796859] env[69648]: DEBUG oslo_concurrency.lockutils [None req-2327f0bc-09d4-4f66-bbd4-609e316788e6 tempest-ImagesNegativeTestJSON-660182022 tempest-ImagesNegativeTestJSON-660182022-project-member] Lock "613910dc-5ba5-482f-8b77-bf978fe622dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 619.284758] env[69648]: WARNING oslo_vmware.rw_handles [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 619.284758] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 619.284758] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 619.284758] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 619.284758] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 619.284758] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 619.284758] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 619.284758] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = 
self._read_status() [ 619.284758] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 619.284758] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 619.284758] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 619.284758] env[69648]: ERROR oslo_vmware.rw_handles [ 619.285308] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/0863602a-4ecc-4327-83a8-f22975dbf813/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 619.286633] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 619.286961] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Copying Virtual Disk [datastore1] vmware_temp/0863602a-4ecc-4327-83a8-f22975dbf813/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/0863602a-4ecc-4327-83a8-f22975dbf813/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 619.289679] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6acb5826-8d5d-4ff1-b863-b042db3cf0ab {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.298608] env[69648]: DEBUG oslo_vmware.api [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Waiting for the task: (returnval){ [ 619.298608] env[69648]: value = "task-3466481" [ 619.298608] env[69648]: _type = "Task" [ 619.298608] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 619.306821] env[69648]: DEBUG oslo_vmware.api [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Task: {'id': task-3466481, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 619.812548] env[69648]: DEBUG oslo_vmware.exceptions [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 619.812923] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 619.817886] env[69648]: ERROR nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 619.817886] env[69648]: Faults: ['InvalidArgument'] [ 619.817886] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Traceback (most recent call last): [ 619.817886] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 619.817886] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] yield resources [ 619.817886] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 619.817886] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] self.driver.spawn(context, instance, image_meta, [ 619.817886] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 619.817886] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] self._vmops.spawn(context, instance, image_meta, injected_files, [ 619.817886] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 619.817886] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] self._fetch_image_if_missing(context, vi) [ 619.817886] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] image_cache(vi, tmp_image_ds_loc) [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] vm_util.copy_virtual_disk( [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] session._wait_for_task(vmdk_copy_task) [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] return self.wait_for_task(task_ref) [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] return evt.wait() [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] result = hub.switch() [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 619.818308] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] return self.greenlet.switch() [ 619.818745] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 619.818745] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] self.f(*self.args, **self.kw) [ 619.818745] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 619.818745] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] raise exceptions.translate_fault(task_info.error) [ 619.818745] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 619.818745] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Faults: ['InvalidArgument'] [ 619.818745] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] [ 619.818745] env[69648]: INFO nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Terminating instance [ 619.820093] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 619.820316] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 619.821474] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Acquiring lock 
"refresh_cache-de093ae4-0e4c-49e8-9beb-c61501c5c705" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 619.821474] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Acquired lock "refresh_cache-de093ae4-0e4c-49e8-9beb-c61501c5c705" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 619.821474] env[69648]: DEBUG nova.network.neutron [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 619.822070] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-61d32ba6-05c0-4b1a-b758-7be31b6fcf52 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.830748] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 619.830833] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 619.831582] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-564b621c-7ed1-44f1-8c9c-4cd669951af6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.840403] env[69648]: DEBUG oslo_vmware.api [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Waiting for the task: (returnval){ [ 619.840403] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52010595-e163-200b-d344-88bf92a72979" [ 619.840403] env[69648]: _type = "Task" [ 619.840403] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 619.848454] env[69648]: DEBUG oslo_vmware.api [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52010595-e163-200b-d344-88bf92a72979, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 619.877153] env[69648]: DEBUG nova.network.neutron [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 619.979148] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d6d75ef3-98ae-4962-8998-48cd9e466ac4 tempest-SecurityGroupsTestJSON-1260615493 tempest-SecurityGroupsTestJSON-1260615493-project-member] Acquiring lock "6e484ac2-6437-488a-97e2-f5dedb5816c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 619.979148] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d6d75ef3-98ae-4962-8998-48cd9e466ac4 tempest-SecurityGroupsTestJSON-1260615493 tempest-SecurityGroupsTestJSON-1260615493-project-member] Lock "6e484ac2-6437-488a-97e2-f5dedb5816c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 620.117330] env[69648]: DEBUG nova.network.neutron [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 620.131365] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Releasing lock "refresh_cache-de093ae4-0e4c-49e8-9beb-c61501c5c705" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 620.131810] env[69648]: DEBUG nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 620.132010] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 620.133723] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-890790f8-9b96-41fc-aa19-d94cd67901d8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.143412] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 620.143669] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1d05f094-448f-4f57-9be5-5f5f523a2969 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.150112] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0953c5eb-55fb-402c-829e-7d380eafab98 tempest-ServersAdminNegativeTestJSON-819959570 tempest-ServersAdminNegativeTestJSON-819959570-project-member] Acquiring lock "e0b188fb-3ec8-46c7-8966-ea4eaef2430b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 620.150336] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0953c5eb-55fb-402c-829e-7d380eafab98 tempest-ServersAdminNegativeTestJSON-819959570 tempest-ServersAdminNegativeTestJSON-819959570-project-member] Lock "e0b188fb-3ec8-46c7-8966-ea4eaef2430b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 620.171733] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 620.171955] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 620.172177] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Deleting the datastore file [datastore1] de093ae4-0e4c-49e8-9beb-c61501c5c705 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 620.172434] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-315e8808-9b0c-4743-ad12-4a840a2e496e {{(pid=69648) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.179693] env[69648]: DEBUG oslo_vmware.api [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Waiting for the task: (returnval){ [ 620.179693] env[69648]: value = "task-3466483" [ 620.179693] env[69648]: _type = "Task" [ 620.179693] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 620.190774] env[69648]: DEBUG oslo_vmware.api [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Task: {'id': task-3466483, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 620.354401] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 620.354675] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Creating directory with path [datastore1] vmware_temp/e15a75af-d0d3-4abb-a3be-e516ddacf0c0/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 620.354922] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-28c98ed2-1445-4fa1-8440-92a0317e3c04 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.366552] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Created directory with path [datastore1] vmware_temp/e15a75af-d0d3-4abb-a3be-e516ddacf0c0/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 620.367766] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Fetch image to [datastore1] vmware_temp/e15a75af-d0d3-4abb-a3be-e516ddacf0c0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 620.367766] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/e15a75af-d0d3-4abb-a3be-e516ddacf0c0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 620.368137] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-25ad67a8-fd93-4a8d-8cd3-87051b8d5be7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.377055] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5486ed09-caaa-4ed5-a072-8796aa755f44 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.386686] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbc6c03e-f062-4792-9ffd-b7c3241d026e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.419815] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7696f3b-cee9-4b93-8fe4-f051ee206a9e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.426119] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ce92ef20-84ae-496f-b1fb-d0fb94d4c525 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.457496] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 620.541362] env[69648]: DEBUG oslo_vmware.rw_handles [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e15a75af-d0d3-4abb-a3be-e516ddacf0c0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 620.618229] env[69648]: DEBUG oslo_vmware.rw_handles [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 620.618229] env[69648]: DEBUG oslo_vmware.rw_handles [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e15a75af-d0d3-4abb-a3be-e516ddacf0c0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 620.692303] env[69648]: DEBUG oslo_vmware.api [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Task: {'id': task-3466483, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.047892} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 620.692303] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 620.692303] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 620.692303] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 620.692670] env[69648]: INFO nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Took 0.56 seconds to destroy the instance on the hypervisor. [ 620.692914] env[69648]: DEBUG oslo.service.loopingcall [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 620.693135] env[69648]: DEBUG nova.compute.manager [-] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Skipping network deallocation for instance since networking was not requested. 
{{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 620.700714] env[69648]: DEBUG nova.compute.claims [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 620.700714] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 620.700714] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 621.214385] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce3f9243-d88e-47ad-b4ee-61b69ec14fab {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.223277] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edd997a9-0872-41e8-af66-0a75503768f4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.256612] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c63b2cca-5fba-46cc-8189-ce7f60819a28 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.264007] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d218e0d-eb95-4622-9a3f-91fb5eb0ea7f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.280820] env[69648]: DEBUG nova.compute.provider_tree [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 621.297028] env[69648]: DEBUG nova.scheduler.client.report [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 621.312107] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.612s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 621.312736] env[69648]: ERROR nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 621.312736] env[69648]: Faults: ['InvalidArgument'] [ 621.312736] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Traceback (most recent call last): [ 621.312736] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 621.312736] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] self.driver.spawn(context, instance, image_meta, [ 621.312736] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 621.312736] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] self._vmops.spawn(context, instance, image_meta, injected_files, [ 621.312736] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 621.312736] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] self._fetch_image_if_missing(context, vi) [ 621.312736] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 621.312736] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] image_cache(vi, tmp_image_ds_loc) [ 621.312736] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] vm_util.copy_virtual_disk( [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] session._wait_for_task(vmdk_copy_task) [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] return self.wait_for_task(task_ref) [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] return evt.wait() [ 
621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] result = hub.switch() [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] return self.greenlet.switch() [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 621.313141] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] self.f(*self.args, **self.kw) [ 621.313517] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 621.313517] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] raise exceptions.translate_fault(task_info.error) [ 621.313517] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 621.313517] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Faults: ['InvalidArgument'] [ 621.313517] env[69648]: ERROR nova.compute.manager [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] [ 621.315366] env[69648]: DEBUG nova.compute.utils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 621.318457] env[69648]: DEBUG nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Build of instance de093ae4-0e4c-49e8-9beb-c61501c5c705 was re-scheduled: A specified parameter was not correct: fileType [ 621.318457] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 621.318898] env[69648]: DEBUG nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 621.319307] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Acquiring lock "refresh_cache-de093ae4-0e4c-49e8-9beb-c61501c5c705" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 621.319507] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 
tempest-ServersAdmin275Test-2007197445-project-member] Acquired lock "refresh_cache-de093ae4-0e4c-49e8-9beb-c61501c5c705" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 621.319705] env[69648]: DEBUG nova.network.neutron [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 621.374314] env[69648]: DEBUG nova.network.neutron [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 621.531279] env[69648]: DEBUG nova.network.neutron [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 621.546461] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Releasing lock "refresh_cache-de093ae4-0e4c-49e8-9beb-c61501c5c705" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 621.546710] env[69648]: DEBUG nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 621.546896] env[69648]: DEBUG nova.compute.manager [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] Skipping network deallocation for instance since networking was not requested. 
{{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 621.725179] env[69648]: INFO nova.scheduler.client.report [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Deleted allocations for instance de093ae4-0e4c-49e8-9beb-c61501c5c705 [ 621.765532] env[69648]: DEBUG oslo_concurrency.lockutils [None req-36b60c48-3569-4438-be38-dd6a7af01c51 tempest-ServersAdmin275Test-2007197445 tempest-ServersAdmin275Test-2007197445-project-member] Lock "de093ae4-0e4c-49e8-9beb-c61501c5c705" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 50.744s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 621.767578] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "de093ae4-0e4c-49e8-9beb-c61501c5c705" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 47.768s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 621.767578] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: de093ae4-0e4c-49e8-9beb-c61501c5c705] During sync_power_state the instance has a pending task (spawning). Skip. [ 621.767578] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "de093ae4-0e4c-49e8-9beb-c61501c5c705" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 621.787527] env[69648]: DEBUG nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 621.853943] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 621.854210] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 621.855750] env[69648]: INFO nova.compute.claims [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 622.360605] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e2bebef-3614-45a0-954a-3de8c0f639a5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.368733] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd109117-e94c-4c49-bce5-592c90a44233 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.402922] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7506b955-bc49-43e1-a9fe-1b4ad7ba65ad {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.411175] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da4a0f4f-e53c-44f7-a2a7-ae97e1ee675e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.425702] env[69648]: DEBUG nova.compute.provider_tree [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 622.438023] env[69648]: DEBUG nova.scheduler.client.report [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 622.457176] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 
tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.601s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 622.457176] env[69648]: DEBUG nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 622.507673] env[69648]: DEBUG nova.compute.utils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 622.508933] env[69648]: DEBUG nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 622.509501] env[69648]: DEBUG nova.network.neutron [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 622.522851] env[69648]: DEBUG nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 622.607972] env[69648]: DEBUG nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 622.640221] env[69648]: DEBUG nova.virt.hardware [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 622.640539] env[69648]: DEBUG nova.virt.hardware [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 622.640751] env[69648]: DEBUG nova.virt.hardware [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 622.641454] env[69648]: DEBUG nova.virt.hardware [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 622.641454] env[69648]: DEBUG nova.virt.hardware [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 622.641454] env[69648]: DEBUG nova.virt.hardware [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 622.641756] env[69648]: DEBUG nova.virt.hardware [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 622.641957] env[69648]: DEBUG nova.virt.hardware [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 622.642553] env[69648]: DEBUG nova.virt.hardware [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] 
Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 622.642553] env[69648]: DEBUG nova.virt.hardware [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 622.642553] env[69648]: DEBUG nova.virt.hardware [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 622.643691] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-399b766b-09e2-4aa1-984a-6fa1956f06cb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.653740] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bee15b0-8011-4ac6-9e82-0be08f191325 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.682997] env[69648]: DEBUG nova.policy [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd02320b12288496eae0a735447321a7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '896367398859465488fc12205d122a4e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 623.378434] env[69648]: DEBUG nova.network.neutron [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Successfully created port: 347461f4-676d-42d4-8ca2-2ad1b83eb1b7 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 623.791452] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquiring lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 623.792019] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 624.194113] env[69648]: DEBUG nova.network.neutron [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 
tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Successfully updated port: 347461f4-676d-42d4-8ca2-2ad1b83eb1b7 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 624.212948] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "refresh_cache-1756fcf7-3d68-4d02-9a66-619d0a1a9505" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 624.212948] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired lock "refresh_cache-1756fcf7-3d68-4d02-9a66-619d0a1a9505" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 624.212948] env[69648]: DEBUG nova.network.neutron [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 624.257173] env[69648]: DEBUG nova.network.neutron [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 624.722826] env[69648]: DEBUG nova.network.neutron [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Updating instance_info_cache with network_info: [{"id": "347461f4-676d-42d4-8ca2-2ad1b83eb1b7", "address": "fa:16:3e:f8:2f:77", "network": {"id": "58c9db42-24fd-4615-9f91-977554db657a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-27923875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "896367398859465488fc12205d122a4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap347461f4-67", "ovs_interfaceid": "347461f4-676d-42d4-8ca2-2ad1b83eb1b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 624.741991] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Releasing lock "refresh_cache-1756fcf7-3d68-4d02-9a66-619d0a1a9505" {{(pid=69648) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 624.742347] env[69648]: DEBUG nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Instance network_info: |[{"id": "347461f4-676d-42d4-8ca2-2ad1b83eb1b7", "address": "fa:16:3e:f8:2f:77", "network": {"id": "58c9db42-24fd-4615-9f91-977554db657a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-27923875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "896367398859465488fc12205d122a4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap347461f4-67", "ovs_interfaceid": "347461f4-676d-42d4-8ca2-2ad1b83eb1b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 624.742762] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f8:2f:77', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '52f465cb-7418-4172-bd7d-aec00abeb692', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '347461f4-676d-42d4-8ca2-2ad1b83eb1b7', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 624.753033] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Creating folder: Project (896367398859465488fc12205d122a4e). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 624.753981] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a60b70a5-7b0a-451d-ba2d-d0cae634e7a4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.769645] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Created folder: Project (896367398859465488fc12205d122a4e) in parent group-v692308. [ 624.769645] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Creating folder: Instances. Parent ref: group-v692337. 
{{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 624.769815] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4e57560c-7f58-4c36-b566-6f55bd9edfca {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.781905] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Created folder: Instances in parent group-v692337. [ 624.781905] env[69648]: DEBUG oslo.service.loopingcall [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 624.781905] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 624.781905] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4bf92c1d-5c66-4ae4-a891-989e0da04723 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.800171] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 624.800171] env[69648]: value = "task-3466486" [ 624.800171] env[69648]: _type = "Task" [ 624.800171] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 624.808954] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466486, 'name': CreateVM_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 624.850545] env[69648]: DEBUG nova.compute.manager [req-2c9b36e4-54bc-48c7-92fd-90e0637059e0 req-ed502180-1e28-4b45-9c5f-4a3fa5c63807 service nova] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Received event network-vif-plugged-347461f4-676d-42d4-8ca2-2ad1b83eb1b7 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 624.850545] env[69648]: DEBUG oslo_concurrency.lockutils [req-2c9b36e4-54bc-48c7-92fd-90e0637059e0 req-ed502180-1e28-4b45-9c5f-4a3fa5c63807 service nova] Acquiring lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 624.850545] env[69648]: DEBUG oslo_concurrency.lockutils [req-2c9b36e4-54bc-48c7-92fd-90e0637059e0 req-ed502180-1e28-4b45-9c5f-4a3fa5c63807 service nova] Lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 624.850545] env[69648]: DEBUG oslo_concurrency.lockutils [req-2c9b36e4-54bc-48c7-92fd-90e0637059e0 req-ed502180-1e28-4b45-9c5f-4a3fa5c63807 service nova] Lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 624.850781] env[69648]: DEBUG nova.compute.manager [req-2c9b36e4-54bc-48c7-92fd-90e0637059e0 req-ed502180-1e28-4b45-9c5f-4a3fa5c63807 service nova] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] No waiting events found dispatching network-vif-plugged-347461f4-676d-42d4-8ca2-2ad1b83eb1b7 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 624.850781] env[69648]: WARNING nova.compute.manager [req-2c9b36e4-54bc-48c7-92fd-90e0637059e0 req-ed502180-1e28-4b45-9c5f-4a3fa5c63807 service nova] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Received unexpected event network-vif-plugged-347461f4-676d-42d4-8ca2-2ad1b83eb1b7 for instance with vm_state building and task_state spawning. [ 625.311516] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466486, 'name': CreateVM_Task, 'duration_secs': 0.305651} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 625.311703] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 625.312707] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 625.312859] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 625.313342] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 625.313647] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b34d7469-577a-44f3-8666-4298a0de2550 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.318720] env[69648]: DEBUG oslo_vmware.api [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 625.318720] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52b2382d-526a-d919-fb8d-a0b85eb58dcd" [ 625.318720] env[69648]: _type = "Task" [ 625.318720] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 625.327031] env[69648]: DEBUG oslo_vmware.api [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52b2382d-526a-d919-fb8d-a0b85eb58dcd, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 625.829479] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 625.829778] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 625.829936] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 627.078765] env[69648]: DEBUG nova.compute.manager [req-2b027071-7af9-4619-97d0-c02b69dc80fb req-780ef129-0553-4979-b8d1-846c033d9a45 service nova] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Received event network-changed-347461f4-676d-42d4-8ca2-2ad1b83eb1b7 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 627.078765] env[69648]: DEBUG nova.compute.manager [req-2b027071-7af9-4619-97d0-c02b69dc80fb req-780ef129-0553-4979-b8d1-846c033d9a45 service nova] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Refreshing instance network info cache due to event network-changed-347461f4-676d-42d4-8ca2-2ad1b83eb1b7. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 627.078765] env[69648]: DEBUG oslo_concurrency.lockutils [req-2b027071-7af9-4619-97d0-c02b69dc80fb req-780ef129-0553-4979-b8d1-846c033d9a45 service nova] Acquiring lock "refresh_cache-1756fcf7-3d68-4d02-9a66-619d0a1a9505" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 627.078765] env[69648]: DEBUG oslo_concurrency.lockutils [req-2b027071-7af9-4619-97d0-c02b69dc80fb req-780ef129-0553-4979-b8d1-846c033d9a45 service nova] Acquired lock "refresh_cache-1756fcf7-3d68-4d02-9a66-619d0a1a9505" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 627.079464] env[69648]: DEBUG nova.network.neutron [req-2b027071-7af9-4619-97d0-c02b69dc80fb req-780ef129-0553-4979-b8d1-846c033d9a45 service nova] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Refreshing network info cache for port 347461f4-676d-42d4-8ca2-2ad1b83eb1b7 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 627.480196] env[69648]: DEBUG nova.network.neutron [req-2b027071-7af9-4619-97d0-c02b69dc80fb req-780ef129-0553-4979-b8d1-846c033d9a45 service nova] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Updated VIF entry in instance network info cache for port 347461f4-676d-42d4-8ca2-2ad1b83eb1b7. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 627.480196] env[69648]: DEBUG nova.network.neutron [req-2b027071-7af9-4619-97d0-c02b69dc80fb req-780ef129-0553-4979-b8d1-846c033d9a45 service nova] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Updating instance_info_cache with network_info: [{"id": "347461f4-676d-42d4-8ca2-2ad1b83eb1b7", "address": "fa:16:3e:f8:2f:77", "network": {"id": "58c9db42-24fd-4615-9f91-977554db657a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-27923875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "896367398859465488fc12205d122a4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap347461f4-67", "ovs_interfaceid": "347461f4-676d-42d4-8ca2-2ad1b83eb1b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.489639] env[69648]: DEBUG oslo_concurrency.lockutils [req-2b027071-7af9-4619-97d0-c02b69dc80fb req-780ef129-0553-4979-b8d1-846c033d9a45 service nova] Releasing lock "refresh_cache-1756fcf7-3d68-4d02-9a66-619d0a1a9505" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 629.106573] env[69648]: DEBUG oslo_concurrency.lockutils [None req-72356b40-a78c-4c8a-b09e-7707263e8788 tempest-ServerGroupTestJSON-191439854 tempest-ServerGroupTestJSON-191439854-project-member] Acquiring lock "c411e626-dff7-4999-8e69-9716f322d518" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 629.107320] env[69648]: DEBUG oslo_concurrency.lockutils [None req-72356b40-a78c-4c8a-b09e-7707263e8788 tempest-ServerGroupTestJSON-191439854 tempest-ServerGroupTestJSON-191439854-project-member] Lock "c411e626-dff7-4999-8e69-9716f322d518" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 634.351956] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b847d18d-9172-45f8-ab9a-4a9d0c89a846 tempest-ServerMetadataNegativeTestJSON-2070852957 tempest-ServerMetadataNegativeTestJSON-2070852957-project-member] Acquiring lock "2c03e5ce-0ebd-40b3-982a-f0f7d4742dde" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 634.352265] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b847d18d-9172-45f8-ab9a-4a9d0c89a846 tempest-ServerMetadataNegativeTestJSON-2070852957 tempest-ServerMetadataNegativeTestJSON-2070852957-project-member] Lock 
"2c03e5ce-0ebd-40b3-982a-f0f7d4742dde" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.033266] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7ce956e2-48f9-4060-b92a-2477aab38cb5 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] Acquiring lock "72591ff3-6bb9-4b0a-9f38-fa6111f74408" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.033558] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7ce956e2-48f9-4060-b92a-2477aab38cb5 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] Lock "72591ff3-6bb9-4b0a-9f38-fa6111f74408" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.332827] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] Acquiring lock "19e7f1b0-5cd9-453f-8600-a7d76487de87" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.333131] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] Lock "19e7f1b0-5cd9-453f-8600-a7d76487de87" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.364253] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] Acquiring lock "2dd9d8b9-8944-43cf-989b-e07354e29d40" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.364501] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] Lock "2dd9d8b9-8944-43cf-989b-e07354e29d40" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.409361] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] Acquiring lock "dd0a5705-5745-439a-9fe2-23b852b86c2c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.409627] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] Lock "dd0a5705-5745-439a-9fe2-23b852b86c2c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.678633] env[69648]: DEBUG oslo_concurrency.lockutils [None req-778af14a-f1b6-46b9-890c-f5cc47f441fd tempest-InstanceActionsV221TestJSON-1043921942 tempest-InstanceActionsV221TestJSON-1043921942-project-member] Acquiring lock "12ddb227-b0e5-47cf-92b0-5c7338c1120e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.679128] env[69648]: DEBUG oslo_concurrency.lockutils [None req-778af14a-f1b6-46b9-890c-f5cc47f441fd tempest-InstanceActionsV221TestJSON-1043921942 tempest-InstanceActionsV221TestJSON-1043921942-project-member] Lock "12ddb227-b0e5-47cf-92b0-5c7338c1120e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 643.482712] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 643.482712] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 643.507355] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 643.507355] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 643.507584] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 643.529795] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 643.530017] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 643.530201] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 54630c78-200e-4b36-8612-34f411e08821] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 643.530361] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 643.530515] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 643.530666] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 643.530831] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 643.530968] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 643.531104] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 643.531228] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 643.531360] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 643.532113] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 644.064849] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 644.065126] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 644.065303] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 645.064897] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 645.065285] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 645.065350] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 645.065506] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 645.076997] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 645.077265] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 645.077440] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 645.077594] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 645.078737] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30493e72-99f8-4bd9-863b-b19d6591aaa3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 645.087739] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc8c32d5-919f-42f9-a479-214fdcdcea6e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 645.104642] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61163ccb-9c5a-460c-a210-0e75331ae60f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 645.110074] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14a12fa9-4116-4b83-b731-9ca9af8bea2d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 645.140104] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180948MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 645.140303] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 645.140509] 
env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 645.217745] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 549c349e-5417-408c-acb2-93e506476e2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 645.217922] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d2e78734-c619-43ab-bdad-bc18cc78c5e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 645.218065] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 54630c78-200e-4b36-8612-34f411e08821 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 645.218219] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 645.218357] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ce04d2df-8587-4cda-93b1-cad7ba3ff670 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 645.218481] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance dfbb396b-8f18-456d-9064-be451cdd1ac9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 645.218604] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 20bce654-7f57-4de6-8f7a-c1b34286fc86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 645.218722] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 928bc799-4fed-4005-89d2-e18196f88ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 645.218839] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ed63f202-c76d-4492-b738-606ee1c6b059 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 645.218954] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 1756fcf7-3d68-4d02-9a66-619d0a1a9505 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 645.244225] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 642ba6f1-b912-4f55-9199-9c98b58ffc1b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.268445] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 91fcee48-3466-480d-bf87-bc4de17fbf31 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.278654] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.288852] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 741e7ee9-8ee1-4b36-89cc-a640e6f6b97a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.300342] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 28f086a0-3197-47a9-ad8d-6cfd3a59bfc3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.309811] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ca4953f5-de55-4476-a845-c633a073eb43 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.332798] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 51ebee08-c929-4485-b229-cd99d35db2f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.343641] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63c7b328-2c8a-42a6-b340-78528a656f9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.354270] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 2b75aa03-750c-49b3-b69a-63bfee58942f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.365024] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance b21a0b12-14c7-491f-ae08-f596924490d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.373872] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 267be1e0-7768-421d-9ae4-f8acb5331b23 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.383742] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 50dd8f44-95ae-4b0c-ad88-2f11f4886d57 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.393104] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 613910dc-5ba5-482f-8b77-bf978fe622dd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.403495] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6e484ac2-6437-488a-97e2-f5dedb5816c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.412669] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e0b188fb-3ec8-46c7-8966-ea4eaef2430b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.421852] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.431628] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c411e626-dff7-4999-8e69-9716f322d518 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.440727] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 2c03e5ce-0ebd-40b3-982a-f0f7d4742dde has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.449805] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 72591ff3-6bb9-4b0a-9f38-fa6111f74408 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.459917] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 19e7f1b0-5cd9-453f-8600-a7d76487de87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.469188] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 2dd9d8b9-8944-43cf-989b-e07354e29d40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.478548] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance dd0a5705-5745-439a-9fe2-23b852b86c2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.488338] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 12ddb227-b0e5-47cf-92b0-5c7338c1120e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 645.488622] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 645.488742] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 645.861636] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3f97473-3da1-41c3-82a4-8eaf3c5c7826 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 645.872927] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-163bf53c-f4ef-4e03-86b9-6dfa8df3d63d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 645.906227] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef0f3df1-7bc6-43ef-a51e-69132722acd8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 645.913404] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2edf1867-597d-41b3-a42e-029a2c55fc76 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 645.926451] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 645.935399] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 645.950023] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 645.950023] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.809s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 669.340643] env[69648]: WARNING oslo_vmware.rw_handles [None 
req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 669.340643] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 669.340643] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 669.340643] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 669.340643] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 669.340643] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 669.340643] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 669.340643] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 669.340643] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 669.340643] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 669.340643] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 669.340643] env[69648]: ERROR oslo_vmware.rw_handles [ 669.341550] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/e15a75af-d0d3-4abb-a3be-e516ddacf0c0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 669.342884] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 669.343203] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Copying Virtual Disk [datastore1] vmware_temp/e15a75af-d0d3-4abb-a3be-e516ddacf0c0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/e15a75af-d0d3-4abb-a3be-e516ddacf0c0/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 669.343551] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9383de29-dcc1-4b58-8100-9137f04cc002 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.352710] env[69648]: DEBUG oslo_vmware.api [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Waiting for the task: (returnval){ [ 669.352710] env[69648]: value = "task-3466487" [ 669.352710] env[69648]: _type = "Task" [ 669.352710] env[69648]: } 
to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 669.360935] env[69648]: DEBUG oslo_vmware.api [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Task: {'id': task-3466487, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 669.863626] env[69648]: DEBUG oslo_vmware.exceptions [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 669.863923] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 669.864486] env[69648]: ERROR nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 669.864486] env[69648]: Faults: ['InvalidArgument'] [ 669.864486] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Traceback (most recent call last): [ 669.864486] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 669.864486] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] yield resources [ 669.864486] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 669.864486] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] self.driver.spawn(context, instance, image_meta, [ 669.864486] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 669.864486] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 669.864486] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 669.864486] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] self._fetch_image_if_missing(context, vi) [ 669.864486] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] image_cache(vi, tmp_image_ds_loc) [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: 
d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] vm_util.copy_virtual_disk( [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] session._wait_for_task(vmdk_copy_task) [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] return self.wait_for_task(task_ref) [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] return evt.wait() [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] result = hub.switch() [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 669.864924] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] return self.greenlet.switch() [ 669.865561] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 669.865561] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] self.f(*self.args, **self.kw) [ 669.865561] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 669.865561] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] raise exceptions.translate_fault(task_info.error) [ 669.865561] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 669.865561] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Faults: ['InvalidArgument'] [ 669.865561] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] [ 669.865561] env[69648]: INFO nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Terminating instance [ 669.866828] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 669.866828] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 669.866967] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-80e8fba9-8c2e-4d1b-a655-d8654fd88ce0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.869152] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquiring lock "refresh_cache-d2e78734-c619-43ab-bdad-bc18cc78c5e5" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 669.869313] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquired lock "refresh_cache-d2e78734-c619-43ab-bdad-bc18cc78c5e5" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 669.869482] env[69648]: DEBUG nova.network.neutron [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 669.876233] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 669.876559] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 669.877598] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-11dd38b3-1c56-4d75-9e3a-7c2b0cba7f5d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.885505] env[69648]: DEBUG oslo_vmware.api [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Waiting for the task: (returnval){ [ 669.885505] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52d6a2c7-de9f-e1e6-cd7a-8b51bee67e5c" [ 669.885505] env[69648]: _type = "Task" [ 669.885505] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 669.895704] env[69648]: DEBUG oslo_vmware.api [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52d6a2c7-de9f-e1e6-cd7a-8b51bee67e5c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 669.900104] env[69648]: DEBUG nova.network.neutron [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 669.989933] env[69648]: DEBUG nova.network.neutron [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 669.999792] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Releasing lock "refresh_cache-d2e78734-c619-43ab-bdad-bc18cc78c5e5" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 670.000269] env[69648]: DEBUG nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 670.000503] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 670.001616] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8ed1768-19fe-4ba6-a6b0-3386361f5cfc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.009821] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 670.009942] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3423e383-1182-42c6-b854-0bf29c37e420 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.046520] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 670.046520] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 670.046520] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Deleting the datastore file [datastore1] d2e78734-c619-43ab-bdad-bc18cc78c5e5 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 670.046520] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-99ee1531-078f-4102-afe0-6ea908a90598 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.053020] env[69648]: DEBUG oslo_vmware.api [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Waiting for the task: (returnval){ [ 670.053020] env[69648]: value = "task-3466489" [ 670.053020] env[69648]: _type = "Task" [ 670.053020] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 670.060938] env[69648]: DEBUG oslo_vmware.api [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Task: {'id': task-3466489, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 670.395789] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 670.396186] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Creating directory with path [datastore1] vmware_temp/68df1c06-6302-4955-81bc-1ad5811df88e/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 670.396275] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fd13bd92-c7c7-4d75-9dfe-eae9d776d3eb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.407967] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Created directory with path [datastore1] vmware_temp/68df1c06-6302-4955-81bc-1ad5811df88e/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 670.408094] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Fetch image to [datastore1] vmware_temp/68df1c06-6302-4955-81bc-1ad5811df88e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 670.408267] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/68df1c06-6302-4955-81bc-1ad5811df88e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 670.409043] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-876f97f3-08df-49ee-84da-d79a918b653b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.415562] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f5cc0d9-2e98-4d1b-a275-25bed6b25c90 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.424515] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-811e5fbd-3650-4927-bc6d-d9b047111e98 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.455614] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b8f7a30-28af-411a-b54b-048212c0550b {{(pid=69648) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.461390] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-acdc5d3c-32ff-4b02-9bae-238785ca7da3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.481795] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 670.541764] env[69648]: DEBUG oslo_vmware.rw_handles [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/68df1c06-6302-4955-81bc-1ad5811df88e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 670.604423] env[69648]: DEBUG oslo_vmware.rw_handles [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 670.604654] env[69648]: DEBUG oslo_vmware.rw_handles [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/68df1c06-6302-4955-81bc-1ad5811df88e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 670.608745] env[69648]: DEBUG oslo_vmware.api [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Task: {'id': task-3466489, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.046722} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 670.609053] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 670.609404] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 670.609541] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 670.609757] env[69648]: INFO nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Took 0.61 seconds to destroy the instance on the hypervisor. [ 670.610014] env[69648]: DEBUG oslo.service.loopingcall [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 670.610262] env[69648]: DEBUG nova.compute.manager [-] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Skipping network deallocation for instance since networking was not requested. 
{{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 670.615875] env[69648]: DEBUG nova.compute.claims [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 670.616061] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 670.616280] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 671.038563] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf473226-e011-405d-8ae4-1fce90cec63e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.046428] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-437108a1-1689-4ea9-bff3-9079c534b8df {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.077259] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c2262a8-a888-4616-b1b4-f80b6c7d55a3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.084572] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18bb2c94-9bfb-445b-835d-c00eaab85a88 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.098449] env[69648]: DEBUG nova.compute.provider_tree [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 671.106982] env[69648]: DEBUG nova.scheduler.client.report [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 671.122060] env[69648]: DEBUG 
oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.505s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 671.122396] env[69648]: ERROR nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 671.122396] env[69648]: Faults: ['InvalidArgument'] [ 671.122396] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Traceback (most recent call last): [ 671.122396] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 671.122396] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] self.driver.spawn(context, instance, image_meta, [ 671.122396] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 671.122396] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 671.122396] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 671.122396] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] self._fetch_image_if_missing(context, vi) [ 671.122396] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 671.122396] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] image_cache(vi, tmp_image_ds_loc) [ 671.122396] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] vm_util.copy_virtual_disk( [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] session._wait_for_task(vmdk_copy_task) [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] return self.wait_for_task(task_ref) [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: 
d2e78734-c619-43ab-bdad-bc18cc78c5e5] return evt.wait() [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] result = hub.switch() [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] return self.greenlet.switch() [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 671.122802] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] self.f(*self.args, **self.kw) [ 671.123166] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 671.123166] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] raise exceptions.translate_fault(task_info.error) [ 671.123166] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 671.123166] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Faults: ['InvalidArgument'] [ 671.123166] env[69648]: ERROR nova.compute.manager [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] [ 671.123166] env[69648]: DEBUG nova.compute.utils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 671.124873] env[69648]: DEBUG nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Build of instance d2e78734-c619-43ab-bdad-bc18cc78c5e5 was re-scheduled: A specified parameter was not correct: fileType [ 671.124873] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 671.125334] env[69648]: DEBUG nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 671.125580] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquiring lock "refresh_cache-d2e78734-c619-43ab-bdad-bc18cc78c5e5" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 671.125731] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Acquired lock "refresh_cache-d2e78734-c619-43ab-bdad-bc18cc78c5e5" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 671.125894] env[69648]: DEBUG nova.network.neutron [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 671.150343] env[69648]: DEBUG nova.network.neutron [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 671.255228] env[69648]: DEBUG nova.network.neutron [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 671.267026] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Releasing lock "refresh_cache-d2e78734-c619-43ab-bdad-bc18cc78c5e5" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 671.267026] env[69648]: DEBUG nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 671.267026] env[69648]: DEBUG nova.compute.manager [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] Skipping network deallocation for instance since networking was not requested. 
{{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 671.367404] env[69648]: INFO nova.scheduler.client.report [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Deleted allocations for instance d2e78734-c619-43ab-bdad-bc18cc78c5e5 [ 671.388637] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0d11c41c-99fa-48ff-89f1-6a978cef504d tempest-ServerDiagnosticsV248Test-1186710790 tempest-ServerDiagnosticsV248Test-1186710790-project-member] Lock "d2e78734-c619-43ab-bdad-bc18cc78c5e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 98.792s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 671.389862] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "d2e78734-c619-43ab-bdad-bc18cc78c5e5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 97.391s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 671.390069] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: d2e78734-c619-43ab-bdad-bc18cc78c5e5] During sync_power_state the instance has a pending task (spawning). Skip. [ 671.390255] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "d2e78734-c619-43ab-bdad-bc18cc78c5e5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 671.416369] env[69648]: DEBUG nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 671.461891] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 671.462162] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 671.463597] env[69648]: INFO nova.compute.claims [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 671.886775] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88c469bb-4a85-465c-b23b-5389ad635364 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.894557] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4a96a2c-f3fa-4064-93dd-dea0abd367b3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.925685] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5383306b-d8b9-41e6-9eda-d3cead45a964 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.934032] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e29aed1-1516-4962-987f-8bf7fc5616aa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.946794] env[69648]: DEBUG nova.compute.provider_tree [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 671.957424] env[69648]: DEBUG nova.scheduler.client.report [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 671.971572] env[69648]: DEBUG 
oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.509s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 671.972060] env[69648]: DEBUG nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 672.002761] env[69648]: DEBUG nova.compute.utils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 672.004797] env[69648]: DEBUG nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 672.005013] env[69648]: DEBUG nova.network.neutron [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 672.013412] env[69648]: DEBUG nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 672.078251] env[69648]: DEBUG nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 672.081907] env[69648]: DEBUG nova.policy [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2b06de6e07a48088211d317d070b18b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f4ba7bf91c1544e281527560cebedfb5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 672.105878] env[69648]: DEBUG nova.virt.hardware [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 672.107898] env[69648]: DEBUG nova.virt.hardware [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 672.107898] env[69648]: DEBUG nova.virt.hardware [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 672.107898] env[69648]: DEBUG nova.virt.hardware [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 672.107898] env[69648]: DEBUG nova.virt.hardware [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 672.107898] env[69648]: DEBUG nova.virt.hardware [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 
672.108171] env[69648]: DEBUG nova.virt.hardware [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 672.108171] env[69648]: DEBUG nova.virt.hardware [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 672.108171] env[69648]: DEBUG nova.virt.hardware [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 672.108171] env[69648]: DEBUG nova.virt.hardware [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 672.108171] env[69648]: DEBUG nova.virt.hardware [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 672.108832] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3193d86a-d92e-4cef-91c7-efb3d240d5fc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.121017] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27dc0a2c-dae8-418c-aa14-0a363dc51e0c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.651115] env[69648]: DEBUG nova.network.neutron [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Successfully created port: 306d6fc4-42b7-4bee-a4d4-42926c13c6ff {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 673.546098] env[69648]: DEBUG nova.compute.manager [req-e8c3ca71-96f1-48f3-a044-a9eb887cf915 req-99d504fb-f87c-4d4c-a14f-97b85bbce629 service nova] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Received event network-vif-plugged-306d6fc4-42b7-4bee-a4d4-42926c13c6ff {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 673.546328] env[69648]: DEBUG oslo_concurrency.lockutils [req-e8c3ca71-96f1-48f3-a044-a9eb887cf915 req-99d504fb-f87c-4d4c-a14f-97b85bbce629 service nova] Acquiring lock "642ba6f1-b912-4f55-9199-9c98b58ffc1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 673.546538] env[69648]: DEBUG oslo_concurrency.lockutils 
[req-e8c3ca71-96f1-48f3-a044-a9eb887cf915 req-99d504fb-f87c-4d4c-a14f-97b85bbce629 service nova] Lock "642ba6f1-b912-4f55-9199-9c98b58ffc1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 673.546705] env[69648]: DEBUG oslo_concurrency.lockutils [req-e8c3ca71-96f1-48f3-a044-a9eb887cf915 req-99d504fb-f87c-4d4c-a14f-97b85bbce629 service nova] Lock "642ba6f1-b912-4f55-9199-9c98b58ffc1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 673.546869] env[69648]: DEBUG nova.compute.manager [req-e8c3ca71-96f1-48f3-a044-a9eb887cf915 req-99d504fb-f87c-4d4c-a14f-97b85bbce629 service nova] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] No waiting events found dispatching network-vif-plugged-306d6fc4-42b7-4bee-a4d4-42926c13c6ff {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 673.547251] env[69648]: WARNING nova.compute.manager [req-e8c3ca71-96f1-48f3-a044-a9eb887cf915 req-99d504fb-f87c-4d4c-a14f-97b85bbce629 service nova] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Received unexpected event network-vif-plugged-306d6fc4-42b7-4bee-a4d4-42926c13c6ff for instance with vm_state building and task_state spawning. [ 673.597756] env[69648]: DEBUG nova.network.neutron [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Successfully updated port: 306d6fc4-42b7-4bee-a4d4-42926c13c6ff {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 673.611401] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "refresh_cache-642ba6f1-b912-4f55-9199-9c98b58ffc1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 673.611561] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired lock "refresh_cache-642ba6f1-b912-4f55-9199-9c98b58ffc1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 673.611713] env[69648]: DEBUG nova.network.neutron [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 673.665887] env[69648]: DEBUG nova.network.neutron [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 673.936043] env[69648]: DEBUG nova.network.neutron [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Updating instance_info_cache with network_info: [{"id": "306d6fc4-42b7-4bee-a4d4-42926c13c6ff", "address": "fa:16:3e:1c:4c:c4", "network": {"id": "07cb66a4-31ee-4795-acb9-03b9394fe3fe", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1187613248-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4ba7bf91c1544e281527560cebedfb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap306d6fc4-42", "ovs_interfaceid": "306d6fc4-42b7-4bee-a4d4-42926c13c6ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 673.946939] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Releasing lock "refresh_cache-642ba6f1-b912-4f55-9199-9c98b58ffc1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 673.947435] env[69648]: DEBUG nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Instance network_info: |[{"id": "306d6fc4-42b7-4bee-a4d4-42926c13c6ff", "address": "fa:16:3e:1c:4c:c4", "network": {"id": "07cb66a4-31ee-4795-acb9-03b9394fe3fe", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1187613248-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4ba7bf91c1544e281527560cebedfb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap306d6fc4-42", "ovs_interfaceid": "306d6fc4-42b7-4bee-a4d4-42926c13c6ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 673.948094] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1c:4c:c4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7abeeabc-351d-404c-ada6-6a7305667707', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '306d6fc4-42b7-4bee-a4d4-42926c13c6ff', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 673.956065] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Creating folder: Project (f4ba7bf91c1544e281527560cebedfb5). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 673.956881] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-55c9545e-db8a-4d3e-8df5-b148bc84003d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.967731] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Created folder: Project (f4ba7bf91c1544e281527560cebedfb5) in parent group-v692308. [ 673.967924] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Creating folder: Instances. Parent ref: group-v692340. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 673.968164] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6fd3250e-0a47-4156-a0a7-89326faff5f3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.977475] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Created folder: Instances in parent group-v692340. [ 673.977710] env[69648]: DEBUG oslo.service.loopingcall [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 673.977895] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 673.978108] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4b8ac34e-36df-4ef9-9105-3b47e10b6fba {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.999133] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 673.999133] env[69648]: value = "task-3466492" [ 673.999133] env[69648]: _type = "Task" [ 673.999133] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 674.006175] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466492, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 674.508856] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466492, 'name': CreateVM_Task, 'duration_secs': 0.322413} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 674.509064] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 674.509928] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 674.510405] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 674.511036] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 674.511324] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-890bf8ea-4d8c-4863-bb28-a9223e4121f8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 674.517913] env[69648]: DEBUG oslo_vmware.api [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 674.517913] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52612f2f-1a69-d0e2-31a1-c94985e0786a" [ 674.517913] env[69648]: _type = "Task" [ 674.517913] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 674.524791] env[69648]: DEBUG oslo_vmware.api [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52612f2f-1a69-d0e2-31a1-c94985e0786a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 674.763196] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquiring lock "62954fe5-a462-40bd-85ec-d03b98d2ec42" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 674.763481] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Lock "62954fe5-a462-40bd-85ec-d03b98d2ec42" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 675.027372] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 675.027653] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 675.027871] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 675.572341] env[69648]: DEBUG nova.compute.manager [req-00a2aaea-9f46-4bd4-8836-bea71a2afbba req-6cbaf670-6882-47e2-ac3e-2b6840e759fe service nova] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Received event network-changed-306d6fc4-42b7-4bee-a4d4-42926c13c6ff {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 675.572556] env[69648]: DEBUG nova.compute.manager [req-00a2aaea-9f46-4bd4-8836-bea71a2afbba req-6cbaf670-6882-47e2-ac3e-2b6840e759fe service nova] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Refreshing instance network info cache due to event network-changed-306d6fc4-42b7-4bee-a4d4-42926c13c6ff. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 675.572771] env[69648]: DEBUG oslo_concurrency.lockutils [req-00a2aaea-9f46-4bd4-8836-bea71a2afbba req-6cbaf670-6882-47e2-ac3e-2b6840e759fe service nova] Acquiring lock "refresh_cache-642ba6f1-b912-4f55-9199-9c98b58ffc1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 675.572916] env[69648]: DEBUG oslo_concurrency.lockutils [req-00a2aaea-9f46-4bd4-8836-bea71a2afbba req-6cbaf670-6882-47e2-ac3e-2b6840e759fe service nova] Acquired lock "refresh_cache-642ba6f1-b912-4f55-9199-9c98b58ffc1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 675.573088] env[69648]: DEBUG nova.network.neutron [req-00a2aaea-9f46-4bd4-8836-bea71a2afbba req-6cbaf670-6882-47e2-ac3e-2b6840e759fe service nova] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Refreshing network info cache for port 306d6fc4-42b7-4bee-a4d4-42926c13c6ff {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 675.902739] env[69648]: DEBUG nova.network.neutron [req-00a2aaea-9f46-4bd4-8836-bea71a2afbba req-6cbaf670-6882-47e2-ac3e-2b6840e759fe service nova] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Updated VIF entry in instance network info cache for port 306d6fc4-42b7-4bee-a4d4-42926c13c6ff. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 675.903117] env[69648]: DEBUG nova.network.neutron [req-00a2aaea-9f46-4bd4-8836-bea71a2afbba req-6cbaf670-6882-47e2-ac3e-2b6840e759fe service nova] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Updating instance_info_cache with network_info: [{"id": "306d6fc4-42b7-4bee-a4d4-42926c13c6ff", "address": "fa:16:3e:1c:4c:c4", "network": {"id": "07cb66a4-31ee-4795-acb9-03b9394fe3fe", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1187613248-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4ba7bf91c1544e281527560cebedfb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap306d6fc4-42", "ovs_interfaceid": "306d6fc4-42b7-4bee-a4d4-42926c13c6ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 675.912518] env[69648]: DEBUG oslo_concurrency.lockutils [req-00a2aaea-9f46-4bd4-8836-bea71a2afbba req-6cbaf670-6882-47e2-ac3e-2b6840e759fe service nova] Releasing lock "refresh_cache-642ba6f1-b912-4f55-9199-9c98b58ffc1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 703.944567] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 703.944887] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 703.944958] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 703.945098] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 703.965105] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 703.965275] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 54630c78-200e-4b36-8612-34f411e08821] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 703.965411] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 703.965542] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 703.965670] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 703.965796] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 703.965921] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 703.966173] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 703.966337] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 703.966463] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 703.966592] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 703.967078] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 704.064638] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 704.064886] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 705.065086] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 705.065355] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 705.065499] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 706.065479] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 707.065414] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 707.077498] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 707.077787] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 707.077961] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 707.078139] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 707.079320] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-944145f0-6aee-4311-8825-787b358bdf26 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.089184] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-831277c1-f11f-4296-90ea-a847adc1d0b3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.102727] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fcc7009-1379-4b3e-ba3d-7c1cc7a7bcb2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.109468] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5094842-8c08-460f-9c3c-dec61023fd1e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.137315] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180956MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 707.137470] env[69648]: DEBUG 
oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 707.137687] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 707.225613] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 549c349e-5417-408c-acb2-93e506476e2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 707.225833] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 54630c78-200e-4b36-8612-34f411e08821 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 707.226026] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 707.226107] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ce04d2df-8587-4cda-93b1-cad7ba3ff670 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 707.226232] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance dfbb396b-8f18-456d-9064-be451cdd1ac9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 707.226352] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 20bce654-7f57-4de6-8f7a-c1b34286fc86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 707.226472] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 928bc799-4fed-4005-89d2-e18196f88ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 707.226589] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ed63f202-c76d-4492-b738-606ee1c6b059 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 707.226705] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 1756fcf7-3d68-4d02-9a66-619d0a1a9505 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 707.226820] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 642ba6f1-b912-4f55-9199-9c98b58ffc1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 707.237661] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 91fcee48-3466-480d-bf87-bc4de17fbf31 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.251019] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.258742] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 741e7ee9-8ee1-4b36-89cc-a640e6f6b97a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.268745] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 28f086a0-3197-47a9-ad8d-6cfd3a59bfc3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.278332] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ca4953f5-de55-4476-a845-c633a073eb43 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.287579] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 51ebee08-c929-4485-b229-cd99d35db2f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.296818] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63c7b328-2c8a-42a6-b340-78528a656f9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.306962] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 2b75aa03-750c-49b3-b69a-63bfee58942f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.316488] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance b21a0b12-14c7-491f-ae08-f596924490d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.326989] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 267be1e0-7768-421d-9ae4-f8acb5331b23 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.337090] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 50dd8f44-95ae-4b0c-ad88-2f11f4886d57 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.346761] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 613910dc-5ba5-482f-8b77-bf978fe622dd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.356502] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6e484ac2-6437-488a-97e2-f5dedb5816c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.366015] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e0b188fb-3ec8-46c7-8966-ea4eaef2430b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.375394] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.387055] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c411e626-dff7-4999-8e69-9716f322d518 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.396159] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 2c03e5ce-0ebd-40b3-982a-f0f7d4742dde has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.405198] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 72591ff3-6bb9-4b0a-9f38-fa6111f74408 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.414672] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 19e7f1b0-5cd9-453f-8600-a7d76487de87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.423459] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 2dd9d8b9-8944-43cf-989b-e07354e29d40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.432528] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance dd0a5705-5745-439a-9fe2-23b852b86c2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.441636] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 12ddb227-b0e5-47cf-92b0-5c7338c1120e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.450196] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 707.450408] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 707.450555] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 707.805586] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac4a2e5d-814b-4e94-888e-a3ac82130283 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.813091] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84d2606b-23ad-4edb-80c0-70f8121fef55 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.842273] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f144028a-416e-4ec4-9803-d05cf456ff64 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.849415] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fc4ffd9-e17c-49cc-a7be-1909cb4ff3e8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.862966] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.875030] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.888390] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 707.889356] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.751s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 719.358288] env[69648]: WARNING oslo_vmware.rw_handles [None 
req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 719.358288] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 719.358288] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 719.358288] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 719.358288] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 719.358288] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 719.358288] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 719.358288] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 719.358288] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 719.358288] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 719.358288] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 719.358288] env[69648]: ERROR oslo_vmware.rw_handles [ 719.358897] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/68df1c06-6302-4955-81bc-1ad5811df88e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 719.360555] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 719.360823] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Copying Virtual Disk [datastore1] vmware_temp/68df1c06-6302-4955-81bc-1ad5811df88e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/68df1c06-6302-4955-81bc-1ad5811df88e/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 719.361134] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5687c607-c394-478d-8858-3561e0c9a1a0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.369496] env[69648]: DEBUG oslo_vmware.api [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Waiting for the task: (returnval){ [ 719.369496] env[69648]: value = "task-3466493" [ 719.369496] env[69648]: _type = "Task" [ 719.369496] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 719.377775] env[69648]: DEBUG oslo_vmware.api [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Task: {'id': task-3466493, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 719.879738] env[69648]: DEBUG oslo_vmware.exceptions [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 719.880052] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 719.880613] env[69648]: ERROR nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 719.880613] env[69648]: Faults: ['InvalidArgument'] [ 719.880613] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] Traceback (most recent call last): [ 719.880613] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 719.880613] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] yield resources [ 719.880613] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 719.880613] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] self.driver.spawn(context, instance, image_meta, [ 719.880613] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 719.880613] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 719.880613] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 719.880613] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] self._fetch_image_if_missing(context, vi) [ 719.880613] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] image_cache(vi, tmp_image_ds_loc) [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] vm_util.copy_virtual_disk( [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] session._wait_for_task(vmdk_copy_task) [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] return self.wait_for_task(task_ref) [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] return evt.wait() [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] result = hub.switch() [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 719.880895] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] return self.greenlet.switch() [ 719.881520] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 719.881520] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] self.f(*self.args, **self.kw) [ 719.881520] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 719.881520] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] raise exceptions.translate_fault(task_info.error) [ 719.881520] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 719.881520] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] Faults: ['InvalidArgument'] [ 719.881520] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] [ 719.881520] env[69648]: INFO nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Terminating instance [ 719.882441] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 719.882653] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 719.882894] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3ae614c5-c867-4715-a068-71b03e2ad6a5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.885271] env[69648]: DEBUG nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 719.885468] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 719.886246] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ec70775-881d-4ea2-ba52-374d8afd0489 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.893154] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 719.893384] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b3d8a885-2475-430d-9ffe-53e6a64c1a1d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.895739] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 719.895930] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 719.896877] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-95e6d479-d6ae-4849-af5a-bf93d5ab307a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.902091] env[69648]: DEBUG oslo_vmware.api [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Waiting for the task: (returnval){ [ 719.902091] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52b43810-9820-e243-3864-53f2ecf0342f" [ 719.902091] env[69648]: _type = "Task" [ 719.902091] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 719.916017] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 719.916262] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Creating directory with path [datastore1] vmware_temp/cbb90c15-5ef8-45fe-aebc-c691604a60c4/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 719.916472] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7507a3c4-f78a-4a09-bfb7-39825d2a0f26 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.937358] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Created directory with path [datastore1] vmware_temp/cbb90c15-5ef8-45fe-aebc-c691604a60c4/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 719.937564] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Fetch image to [datastore1] vmware_temp/cbb90c15-5ef8-45fe-aebc-c691604a60c4/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 719.937773] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/cbb90c15-5ef8-45fe-aebc-c691604a60c4/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 719.938621] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da6a3aca-4cd2-455e-bfde-dd5c4ec6a55e {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.946227] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22179607-e8fe-4b9f-b04c-9e0422d049d8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.955896] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17adc98f-60d7-4bfc-a075-23a585a9926c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.991485] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a103b00-bb3e-4163-98b1-58a78c3988b7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.995027] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 719.995027] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 719.995027] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Deleting the datastore file [datastore1] 549c349e-5417-408c-acb2-93e506476e2a {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 719.995027] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d780c84b-5821-4ab0-8259-bfa977fc2b74 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 720.000767] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e2553d58-f5bd-434e-b822-fc5b15391a72 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 720.002489] env[69648]: DEBUG oslo_vmware.api [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Waiting for the task: (returnval){ [ 720.002489] env[69648]: value = "task-3466495" [ 720.002489] env[69648]: _type = "Task" [ 720.002489] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 720.010016] env[69648]: DEBUG oslo_vmware.api [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Task: {'id': task-3466495, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 720.020378] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 720.075898] env[69648]: DEBUG oslo_vmware.rw_handles [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cbb90c15-5ef8-45fe-aebc-c691604a60c4/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 720.136523] env[69648]: DEBUG oslo_vmware.rw_handles [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 720.136720] env[69648]: DEBUG oslo_vmware.rw_handles [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cbb90c15-5ef8-45fe-aebc-c691604a60c4/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 720.512849] env[69648]: DEBUG oslo_vmware.api [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Task: {'id': task-3466495, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067522} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 720.513186] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 720.513401] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 720.513580] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 720.513757] env[69648]: INFO nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Took 0.63 seconds to destroy the instance on the hypervisor. [ 720.516236] env[69648]: DEBUG nova.compute.claims [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 720.516421] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 720.516645] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 720.967896] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-982fbdcd-51d0-4481-8374-3cf963e1c745 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 720.975808] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ac98026-4fa6-451d-b27e-cb7579ec3391 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.005469] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad0fbc46-3bc9-428e-a032-2b32d824eac5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.012205] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-d17950a6-2cfd-45c1-b149-4d10d6ad9361 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.025847] env[69648]: DEBUG nova.compute.provider_tree [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 721.035478] env[69648]: DEBUG nova.scheduler.client.report [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 721.048409] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.532s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 721.048949] env[69648]: ERROR nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 721.048949] env[69648]: Faults: ['InvalidArgument'] [ 721.048949] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] Traceback (most recent call last): [ 721.048949] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 721.048949] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] self.driver.spawn(context, instance, image_meta, [ 721.048949] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 721.048949] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 721.048949] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 721.048949] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] self._fetch_image_if_missing(context, vi) [ 721.048949] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 721.048949] env[69648]: ERROR nova.compute.manager [instance: 
549c349e-5417-408c-acb2-93e506476e2a] image_cache(vi, tmp_image_ds_loc) [ 721.048949] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] vm_util.copy_virtual_disk( [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] session._wait_for_task(vmdk_copy_task) [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] return self.wait_for_task(task_ref) [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] return evt.wait() [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] result = hub.switch() [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] return self.greenlet.switch() [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 721.049297] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] self.f(*self.args, **self.kw) [ 721.049645] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 721.049645] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] raise exceptions.translate_fault(task_info.error) [ 721.049645] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 721.049645] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] Faults: ['InvalidArgument'] [ 721.049645] env[69648]: ERROR nova.compute.manager [instance: 549c349e-5417-408c-acb2-93e506476e2a] [ 721.049645] env[69648]: DEBUG nova.compute.utils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 721.051063] env[69648]: DEBUG nova.compute.manager [None 
req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Build of instance 549c349e-5417-408c-acb2-93e506476e2a was re-scheduled: A specified parameter was not correct: fileType [ 721.051063] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 721.051439] env[69648]: DEBUG nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 721.051611] env[69648]: DEBUG nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 721.051767] env[69648]: DEBUG nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 721.051930] env[69648]: DEBUG nova.network.neutron [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 721.396517] env[69648]: DEBUG nova.network.neutron [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.414330] env[69648]: INFO nova.compute.manager [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] [instance: 549c349e-5417-408c-acb2-93e506476e2a] Took 0.36 seconds to deallocate network for instance. 
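Both tracebacks above end in the same loop: wait_for_task blocks on an eventlet event while a looping call repeatedly polls the CopyVirtualDisk task, and once vCenter marks the task as errored, _poll_task raises the translated fault ("A specified parameter was not correct: fileType"). The snippet below is a minimal, self-contained sketch of that poll-then-raise shape, not the oslo.vmware implementation; get_task_info, TaskFault, and the 0.5 s interval are illustrative assumptions for the example.

# Minimal sketch of the poll-and-raise pattern visible in the tracebacks above.
# get_task_info() and TaskFault are illustrative stand-ins, not oslo.vmware APIs.
import time


class TaskFault(Exception):
    """Illustrative stand-in for a translated VMware task fault."""


def get_task_info(task_ref, _state={"calls": 0}):
    # Fake task-info fetcher: report "running" twice, then an error,
    # mimicking a CopyVirtualDisk_Task that fails server-side.
    _state["calls"] += 1
    if _state["calls"] < 3:
        return {"state": "running", "progress": _state["calls"] * 10}
    return {"state": "error",
            "error": "A specified parameter was not correct: fileType"}


def wait_for_task(task_ref, poll_interval=0.5):
    """Poll task info until success, raising on error (schematic only)."""
    while True:
        info = get_task_info(task_ref)
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            # The real code path translates the server fault before raising it.
            raise TaskFault(info["error"])
        time.sleep(poll_interval)


if __name__ == "__main__":
    try:
        wait_for_task("task-3466493")
    except TaskFault as exc:
        print(f"task failed: {exc}")

On a failure like the one logged here, that raised fault is what propagates back up through _cache_sparse_image and spawn into the compute manager, which then aborts the claim and re-schedules the instance, as the surrounding records show.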
[ 721.517952] env[69648]: INFO nova.scheduler.client.report [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Deleted allocations for instance 549c349e-5417-408c-acb2-93e506476e2a [ 721.542065] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0000ba4d-4b01-4b9b-8d56-30d0913bb129 tempest-ServerDiagnosticsTest-1004028490 tempest-ServerDiagnosticsTest-1004028490-project-member] Lock "549c349e-5417-408c-acb2-93e506476e2a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 149.960s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 721.543279] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "549c349e-5417-408c-acb2-93e506476e2a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 147.544s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 721.543468] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 549c349e-5417-408c-acb2-93e506476e2a] During sync_power_state the instance has a pending task (spawning). Skip. [ 721.543641] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "549c349e-5417-408c-acb2-93e506476e2a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 721.557900] env[69648]: DEBUG nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 721.611865] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 721.612187] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 721.613634] env[69648]: INFO nova.compute.claims [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 722.043859] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c60e3d9-a65b-4dd5-926d-06fbeb18e42d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.051468] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cdf5f0c-ac64-4029-97e1-f8a5fdfb0496 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.083922] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15372841-c755-470e-becb-2c8d89453e22 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.089721] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48746b6f-04e6-4c16-8d04-d9d4673eb4b7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.103081] env[69648]: DEBUG nova.compute.provider_tree [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 722.111970] env[69648]: DEBUG nova.scheduler.client.report [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 722.129540] env[69648]: DEBUG 
oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.517s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 722.130107] env[69648]: DEBUG nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 722.165175] env[69648]: DEBUG nova.compute.utils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 722.167391] env[69648]: DEBUG nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 722.167606] env[69648]: DEBUG nova.network.neutron [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 722.178533] env[69648]: DEBUG nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 722.250701] env[69648]: DEBUG nova.policy [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2b06de6e07a48088211d317d070b18b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f4ba7bf91c1544e281527560cebedfb5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 722.256118] env[69648]: DEBUG nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 722.282940] env[69648]: DEBUG nova.virt.hardware [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 722.283195] env[69648]: DEBUG nova.virt.hardware [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 722.283356] env[69648]: DEBUG nova.virt.hardware [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 722.283541] env[69648]: DEBUG nova.virt.hardware [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 722.283690] env[69648]: DEBUG nova.virt.hardware [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 722.283840] env[69648]: DEBUG nova.virt.hardware [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 722.284082] env[69648]: DEBUG nova.virt.hardware [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 722.284255] env[69648]: DEBUG nova.virt.hardware [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
722.284422] env[69648]: DEBUG nova.virt.hardware [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 722.284585] env[69648]: DEBUG nova.virt.hardware [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 722.284758] env[69648]: DEBUG nova.virt.hardware [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 722.285631] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-164f4a95-998b-4a78-b76d-267491f10fcb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.293834] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b06f1415-b2fd-4f27-b82d-b113f9f96bf5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.678288] env[69648]: DEBUG nova.network.neutron [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Successfully created port: 12ab515f-09b3-4f85-a7ba-2b13c728d72e {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 723.620441] env[69648]: DEBUG nova.network.neutron [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Successfully updated port: 12ab515f-09b3-4f85-a7ba-2b13c728d72e {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 723.635690] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "refresh_cache-91fcee48-3466-480d-bf87-bc4de17fbf31" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 723.636048] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired lock "refresh_cache-91fcee48-3466-480d-bf87-bc4de17fbf31" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 723.636284] env[69648]: DEBUG nova.network.neutron [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 723.687377] env[69648]: DEBUG 
nova.network.neutron [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 723.692207] env[69648]: DEBUG nova.compute.manager [req-002ab5c4-7705-4f7e-b55e-2fde2825e503 req-f3253015-a1b7-496e-bbf2-47021910f6b7 service nova] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Received event network-vif-plugged-12ab515f-09b3-4f85-a7ba-2b13c728d72e {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 723.692207] env[69648]: DEBUG oslo_concurrency.lockutils [req-002ab5c4-7705-4f7e-b55e-2fde2825e503 req-f3253015-a1b7-496e-bbf2-47021910f6b7 service nova] Acquiring lock "91fcee48-3466-480d-bf87-bc4de17fbf31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 723.692207] env[69648]: DEBUG oslo_concurrency.lockutils [req-002ab5c4-7705-4f7e-b55e-2fde2825e503 req-f3253015-a1b7-496e-bbf2-47021910f6b7 service nova] Lock "91fcee48-3466-480d-bf87-bc4de17fbf31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 723.692346] env[69648]: DEBUG oslo_concurrency.lockutils [req-002ab5c4-7705-4f7e-b55e-2fde2825e503 req-f3253015-a1b7-496e-bbf2-47021910f6b7 service nova] Lock "91fcee48-3466-480d-bf87-bc4de17fbf31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 723.692802] env[69648]: DEBUG nova.compute.manager [req-002ab5c4-7705-4f7e-b55e-2fde2825e503 req-f3253015-a1b7-496e-bbf2-47021910f6b7 service nova] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] No waiting events found dispatching network-vif-plugged-12ab515f-09b3-4f85-a7ba-2b13c728d72e {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 723.692802] env[69648]: WARNING nova.compute.manager [req-002ab5c4-7705-4f7e-b55e-2fde2825e503 req-f3253015-a1b7-496e-bbf2-47021910f6b7 service nova] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Received unexpected event network-vif-plugged-12ab515f-09b3-4f85-a7ba-2b13c728d72e for instance with vm_state building and task_state spawning. 
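The inventory repeatedly reported for provider d38a352b-7808-44da-8216-792e96aadc88 above (VCPU total 48 at allocation_ratio 4.0, MEMORY_MB total 196590 with 512 reserved, DISK_GB total 400 with max_unit 94) is what the claim logged at [ 721.613634] is checked against. A rough reading of those numbers, assuming the usual placement rule that schedulable capacity is (total - reserved) * allocation_ratio while max_unit caps any single allocation, is sketched below; effective_capacity is an illustrative helper, not a Nova or placement API.

# Rough reading of the inventory dict logged above; effective_capacity() is an
# illustrative helper, assuming capacity = (total - reserved) * allocation_ratio.
inventory = {
    "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0, "max_unit": 16},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0, "max_unit": 65530},
    "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0, "max_unit": 94},
}


def effective_capacity(inv):
    """Schedulable capacity per resource class under the assumed formula."""
    return {
        rc: (fields["total"] - fields["reserved"]) * fields["allocation_ratio"]
        for rc, fields in inv.items()
    }


print(effective_capacity(inventory))
# {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}
# max_unit still caps any single allocation (e.g. at most 16 VCPU or 94 GB per instance).

On that reading an m1.nano instance (1 VCPU, 128 MB RAM, 1 GB root disk, per the flavor logged above) fits easily, which is consistent with the claim succeeding on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28.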
[ 723.926660] env[69648]: DEBUG nova.network.neutron [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Updating instance_info_cache with network_info: [{"id": "12ab515f-09b3-4f85-a7ba-2b13c728d72e", "address": "fa:16:3e:cd:4d:a9", "network": {"id": "07cb66a4-31ee-4795-acb9-03b9394fe3fe", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1187613248-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4ba7bf91c1544e281527560cebedfb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap12ab515f-09", "ovs_interfaceid": "12ab515f-09b3-4f85-a7ba-2b13c728d72e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.942858] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Releasing lock "refresh_cache-91fcee48-3466-480d-bf87-bc4de17fbf31" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 723.942858] env[69648]: DEBUG nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Instance network_info: |[{"id": "12ab515f-09b3-4f85-a7ba-2b13c728d72e", "address": "fa:16:3e:cd:4d:a9", "network": {"id": "07cb66a4-31ee-4795-acb9-03b9394fe3fe", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1187613248-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4ba7bf91c1544e281527560cebedfb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap12ab515f-09", "ovs_interfaceid": "12ab515f-09b3-4f85-a7ba-2b13c728d72e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 723.943050] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None 
req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cd:4d:a9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7abeeabc-351d-404c-ada6-6a7305667707', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '12ab515f-09b3-4f85-a7ba-2b13c728d72e', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 723.950727] env[69648]: DEBUG oslo.service.loopingcall [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 723.951999] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 723.951999] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e9f31094-4829-4597-88da-1c4e83c66861 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.972210] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 723.972210] env[69648]: value = "task-3466496" [ 723.972210] env[69648]: _type = "Task" [ 723.972210] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 723.980034] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466496, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 724.483939] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466496, 'name': CreateVM_Task, 'duration_secs': 0.2949} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 724.484132] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 724.484759] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 724.484923] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 724.485274] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 724.485517] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4bf347e5-5bba-445a-bce8-fee5590bd56d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.489824] env[69648]: DEBUG oslo_vmware.api [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 724.489824] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52b1d5de-3ec3-f85b-f8af-d56dfe7d6f9e" [ 724.489824] env[69648]: _type = "Task" [ 724.489824] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 724.496796] env[69648]: DEBUG oslo_vmware.api [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52b1d5de-3ec3-f85b-f8af-d56dfe7d6f9e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 725.001943] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 725.003671] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 725.003918] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 725.806351] env[69648]: DEBUG nova.compute.manager [req-22a75d25-b0a8-4664-b58c-901926e8629a req-2d08f938-c299-413f-ab78-518ce43eaaaf service nova] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Received event network-changed-12ab515f-09b3-4f85-a7ba-2b13c728d72e {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 725.806562] env[69648]: DEBUG nova.compute.manager [req-22a75d25-b0a8-4664-b58c-901926e8629a req-2d08f938-c299-413f-ab78-518ce43eaaaf service nova] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Refreshing instance network info cache due to event network-changed-12ab515f-09b3-4f85-a7ba-2b13c728d72e. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 725.806774] env[69648]: DEBUG oslo_concurrency.lockutils [req-22a75d25-b0a8-4664-b58c-901926e8629a req-2d08f938-c299-413f-ab78-518ce43eaaaf service nova] Acquiring lock "refresh_cache-91fcee48-3466-480d-bf87-bc4de17fbf31" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 725.806922] env[69648]: DEBUG oslo_concurrency.lockutils [req-22a75d25-b0a8-4664-b58c-901926e8629a req-2d08f938-c299-413f-ab78-518ce43eaaaf service nova] Acquired lock "refresh_cache-91fcee48-3466-480d-bf87-bc4de17fbf31" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 725.808104] env[69648]: DEBUG nova.network.neutron [req-22a75d25-b0a8-4664-b58c-901926e8629a req-2d08f938-c299-413f-ab78-518ce43eaaaf service nova] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Refreshing network info cache for port 12ab515f-09b3-4f85-a7ba-2b13c728d72e {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 726.541283] env[69648]: DEBUG nova.network.neutron [req-22a75d25-b0a8-4664-b58c-901926e8629a req-2d08f938-c299-413f-ab78-518ce43eaaaf service nova] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Updated VIF entry in instance network info cache for port 12ab515f-09b3-4f85-a7ba-2b13c728d72e. 
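The network-changed handling above acquires the "refresh_cache-<uuid>" lock, re-queries the port, writes the updated entry back to the instance network info cache, and releases the lock. A rough sketch of that guard is below, assuming oslo.concurrency's lockutils.lock() context manager (the source of the Acquiring/Acquired/Releasing lines in this log); refresh_port_from_neutron and save_cache are hypothetical placeholders rather than Nova APIs:

    # Sketch only: serialize per-instance network info cache refreshes, in the
    # same spirit as the "refresh_cache-<uuid>" lock in the log.
    from oslo_concurrency import lockutils

    def refresh_instance_nw_cache(instance_uuid, port_id,
                                  refresh_port_from_neutron, save_cache):
        lock_name = f"refresh_cache-{instance_uuid}"
        with lockutils.lock(lock_name):
            # Re-query the port from Neutron and rebuild this network_info entry.
            vif = refresh_port_from_neutron(port_id)
            # Persist the rebuilt entry into the instance info cache.
            save_cache(instance_uuid, [vif])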
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 726.542422] env[69648]: DEBUG nova.network.neutron [req-22a75d25-b0a8-4664-b58c-901926e8629a req-2d08f938-c299-413f-ab78-518ce43eaaaf service nova] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Updating instance_info_cache with network_info: [{"id": "12ab515f-09b3-4f85-a7ba-2b13c728d72e", "address": "fa:16:3e:cd:4d:a9", "network": {"id": "07cb66a4-31ee-4795-acb9-03b9394fe3fe", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1187613248-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4ba7bf91c1544e281527560cebedfb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap12ab515f-09", "ovs_interfaceid": "12ab515f-09b3-4f85-a7ba-2b13c728d72e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.554225] env[69648]: DEBUG oslo_concurrency.lockutils [req-22a75d25-b0a8-4664-b58c-901926e8629a req-2d08f938-c299-413f-ab78-518ce43eaaaf service nova] Releasing lock "refresh_cache-91fcee48-3466-480d-bf87-bc4de17fbf31" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 728.711316] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Acquiring lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 728.711316] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 763.883602] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 764.065265] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 764.065503] env[69648]: DEBUG 
oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 764.065694] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 765.060769] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 765.064302] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 765.064472] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 765.064595] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 765.084455] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 54630c78-200e-4b36-8612-34f411e08821] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 765.084556] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 765.084692] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 765.084821] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 765.084947] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 765.085093] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 765.085221] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 765.085343] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 765.085462] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 765.085581] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 765.085703] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 766.064746] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 767.070033] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 767.070033] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 767.070033] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 767.070033] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 767.080578] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 767.080832] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 767.081012] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 767.081191] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 767.082712] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-daf5197d-a06c-4610-a317-860c4b4fffff {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 767.095237] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fe6bd99-0f97-48ce-9c4c-36c0d9d1eb45 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 767.113023] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-488eaed3-c96d-420b-bee2-81bd61de9d3b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 767.115938] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d639e383-6ecc-4deb-b2d6-4c3baeb29690 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 767.144902] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180983MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 767.145066] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 767.145363] 
env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 767.222187] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 54630c78-200e-4b36-8612-34f411e08821 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 767.222356] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 767.222558] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ce04d2df-8587-4cda-93b1-cad7ba3ff670 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 767.222720] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance dfbb396b-8f18-456d-9064-be451cdd1ac9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 767.222854] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 20bce654-7f57-4de6-8f7a-c1b34286fc86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 767.222960] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 928bc799-4fed-4005-89d2-e18196f88ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 767.223092] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ed63f202-c76d-4492-b738-606ee1c6b059 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 767.223211] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 1756fcf7-3d68-4d02-9a66-619d0a1a9505 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 767.223333] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 642ba6f1-b912-4f55-9199-9c98b58ffc1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 767.223447] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 91fcee48-3466-480d-bf87-bc4de17fbf31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 767.234978] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.245549] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 741e7ee9-8ee1-4b36-89cc-a640e6f6b97a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.255567] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 28f086a0-3197-47a9-ad8d-6cfd3a59bfc3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.265975] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ca4953f5-de55-4476-a845-c633a073eb43 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.277266] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 51ebee08-c929-4485-b229-cd99d35db2f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.287132] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63c7b328-2c8a-42a6-b340-78528a656f9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.296982] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 2b75aa03-750c-49b3-b69a-63bfee58942f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.307033] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance b21a0b12-14c7-491f-ae08-f596924490d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.317026] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 267be1e0-7768-421d-9ae4-f8acb5331b23 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.326554] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 50dd8f44-95ae-4b0c-ad88-2f11f4886d57 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.335421] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 613910dc-5ba5-482f-8b77-bf978fe622dd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.344868] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6e484ac2-6437-488a-97e2-f5dedb5816c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.353658] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e0b188fb-3ec8-46c7-8966-ea4eaef2430b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.362015] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.370408] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c411e626-dff7-4999-8e69-9716f322d518 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.379382] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 2c03e5ce-0ebd-40b3-982a-f0f7d4742dde has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.388531] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 72591ff3-6bb9-4b0a-9f38-fa6111f74408 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.397677] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 19e7f1b0-5cd9-453f-8600-a7d76487de87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.406037] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 2dd9d8b9-8944-43cf-989b-e07354e29d40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.414993] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance dd0a5705-5745-439a-9fe2-23b852b86c2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.424721] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 12ddb227-b0e5-47cf-92b0-5c7338c1120e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.433474] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.442225] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
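The "Final resource view" reported just below is consistent with the allocations listed above: ten instances are actively managed on this node, each holding DISK_GB=1, MEMORY_MB=128, VCPU=1, and the inventory reserves 512 MB of RAM for the host; the instances that are only scheduled here are skipped. A quick check of that arithmetic (pure illustration, not Nova code):

    # Reproduce the "Final resource view" numbers from the allocations above.
    instances = 10                     # actively managed instances on this node
    per_instance = {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1}
    reserved_ram_mb = 512              # MEMORY_MB 'reserved' in the inventory report

    used_ram_mb = reserved_ram_mb + instances * per_instance["MEMORY_MB"]
    used_disk_gb = instances * per_instance["DISK_GB"]
    used_vcpus = instances * per_instance["VCPU"]

    print(used_ram_mb, used_disk_gb, used_vcpus)   # 1792 10 10, matching the log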
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 767.442480] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 767.442672] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 767.819266] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c5aba5a-b486-482d-832d-8ca8650445c8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 767.827025] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c26a23a8-fedc-42f2-b4e8-5b0491f24431 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 767.858394] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f62c332a-ebea-49bb-9405-fcd6b11d8af0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 767.866345] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-617d1d09-c8ab-45d6-99f5-fab7dd2eb1ce {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 767.881047] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 767.890472] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 767.906766] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 767.906979] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.762s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 769.342057] env[69648]: WARNING oslo_vmware.rw_handles [None 
req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 769.342057] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 769.342057] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 769.342057] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 769.342057] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 769.342057] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 769.342057] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 769.342057] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 769.342057] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 769.342057] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 769.342057] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 769.342057] env[69648]: ERROR oslo_vmware.rw_handles [ 769.342057] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/cbb90c15-5ef8-45fe-aebc-c691604a60c4/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 769.343148] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 769.343404] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Copying Virtual Disk [datastore1] vmware_temp/cbb90c15-5ef8-45fe-aebc-c691604a60c4/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/cbb90c15-5ef8-45fe-aebc-c691604a60c4/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 769.343677] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c49ccf41-0c77-4c9f-84f7-08cae3e51eaa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.351315] env[69648]: DEBUG oslo_vmware.api [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Waiting for the task: (returnval){ [ 769.351315] env[69648]: value = "task-3466497" [ 769.351315] env[69648]: _type = "Task" [ 769.351315] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 769.359982] env[69648]: DEBUG oslo_vmware.api [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Task: {'id': task-3466497, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 769.862141] env[69648]: DEBUG oslo_vmware.exceptions [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 769.862425] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 769.862980] env[69648]: ERROR nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 769.862980] env[69648]: Faults: ['InvalidArgument'] [ 769.862980] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] Traceback (most recent call last): [ 769.862980] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 769.862980] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] yield resources [ 769.862980] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 769.862980] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] self.driver.spawn(context, instance, image_meta, [ 769.862980] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 769.862980] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] self._vmops.spawn(context, instance, image_meta, injected_files, [ 769.862980] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 769.862980] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] self._fetch_image_if_missing(context, vi) [ 769.862980] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] image_cache(vi, tmp_image_ds_loc) [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] vm_util.copy_virtual_disk( [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] session._wait_for_task(vmdk_copy_task) [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] return self.wait_for_task(task_ref) [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] return evt.wait() [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] result = hub.switch() [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 769.863329] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] return self.greenlet.switch() [ 769.863725] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 769.863725] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] self.f(*self.args, **self.kw) [ 769.863725] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 769.863725] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] raise exceptions.translate_fault(task_info.error) [ 769.863725] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 769.863725] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] Faults: ['InvalidArgument'] [ 769.863725] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] [ 769.863725] env[69648]: INFO nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Terminating instance [ 769.864830] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 769.865046] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 769.865285] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8e03731b-145f-4199-8457-fcf175155bf9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.867501] env[69648]: DEBUG nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 769.867689] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 769.868488] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ffa9ccc-452a-408c-8a48-7e58b0acbf7e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.875647] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 769.875760] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c112d5a6-76ff-4edd-9d2b-40b1dfc34f2c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.877995] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 769.878223] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 769.879408] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-01034bb3-c7e8-48b5-82d1-a3a7f53952fa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.884211] env[69648]: DEBUG oslo_vmware.api [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Waiting for the task: (returnval){ [ 769.884211] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52e10771-b265-5860-bf9d-d9f4c3445cbb" [ 769.884211] env[69648]: _type = "Task" [ 769.884211] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 769.891530] env[69648]: DEBUG oslo_vmware.api [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52e10771-b265-5860-bf9d-d9f4c3445cbb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 769.940271] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 769.940521] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 769.940721] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Deleting the datastore file [datastore1] 54630c78-200e-4b36-8612-34f411e08821 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 769.941007] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-53020f74-a244-4526-bcc0-5b0fcd0ce099 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.946781] env[69648]: DEBUG oslo_vmware.api [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Waiting for the task: (returnval){ [ 769.946781] env[69648]: value = "task-3466499" [ 769.946781] env[69648]: _type = "Task" [ 769.946781] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 769.954164] env[69648]: DEBUG oslo_vmware.api [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Task: {'id': task-3466499, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 770.395636] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 770.395981] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Creating directory with path [datastore1] vmware_temp/52c21a2a-f8ae-47cd-99c0-95ab1d22fea7/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 770.396193] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9024b6e2-7035-4601-bbba-ee6edb51aadb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.407775] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Created directory with path [datastore1] vmware_temp/52c21a2a-f8ae-47cd-99c0-95ab1d22fea7/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 770.407956] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Fetch image to [datastore1] vmware_temp/52c21a2a-f8ae-47cd-99c0-95ab1d22fea7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 770.408142] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/52c21a2a-f8ae-47cd-99c0-95ab1d22fea7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 770.408905] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0faebdf-fd6f-4c58-8d08-59d9f1973773 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.415448] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d84ca078-7153-42ee-8110-d66151672cba {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.424412] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6c6dbb9-8384-400e-9e2c-3f6a2d28c059 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.457902] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d42bf0f-af09-4ec6-a509-e433b1121fae {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.464702] env[69648]: DEBUG oslo_vmware.api [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Task: {'id': task-3466499, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073622} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 770.466179] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 770.466375] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 770.466546] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 770.466719] env[69648]: INFO nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Took 0.60 seconds to destroy the instance on the hypervisor. 
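The DeleteDatastoreFile_Task exchange logged above (invoke, poll at 0%, completion with duration_secs) follows the usual oslo.vmware session pattern: a *_Task call returns a task reference, and wait_for_task() polls it until it succeeds or raises the translated fault. A minimal sketch of that pattern follows; the vCenter host and credentials are placeholders, the datastore path is the one deleted above, and this is an illustration rather than the exact nova.virt.vmwareapi.ds_util code.

    from oslo_vmware import api, vim_util

    # Placeholder endpoint and credentials, not taken from this log.
    session = api.VMwareAPISession(
        'vc.example.test', 'administrator@vsphere.local', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # Pick a datacenter reference; a real caller would select it properly.
    dc_ref = session.invoke_api(
        vim_util, 'get_objects', session.vim, 'Datacenter', 1).objects[0].obj

    # FileManager lives on the service content; the *_Task call returns a
    # task moref that wait_for_task() polls, mirroring the progress lines above.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] 54630c78-200e-4b36-8612-34f411e08821',
        datacenter=dc_ref)
    session.wait_for_task(task)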
[ 770.468489] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-26d60bb7-a264-4b7f-a980-371eef413f32 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.470463] env[69648]: DEBUG nova.compute.claims [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 770.470635] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 770.470848] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 770.560978] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 770.631388] env[69648]: DEBUG oslo_vmware.rw_handles [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/52c21a2a-f8ae-47cd-99c0-95ab1d22fea7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 770.695914] env[69648]: DEBUG oslo_vmware.rw_handles [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 770.696118] env[69648]: DEBUG oslo_vmware.rw_handles [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/52c21a2a-f8ae-47cd-99c0-95ab1d22fea7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 770.928597] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ad307fd-0d6c-4a13-b5f8-804f62966f72 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.936566] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aaf2135-743c-4050-a427-2e56b7c00536 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.966253] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d3d4d16-0777-44de-b744-1f2519cc5c64 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.973423] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0f24da7-dcbc-484d-8e4c-4cc399b240ca {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.987537] env[69648]: DEBUG nova.compute.provider_tree [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 770.995667] env[69648]: DEBUG nova.scheduler.client.report [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 771.008723] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.538s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 771.009300] env[69648]: ERROR nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 771.009300] env[69648]: Faults: ['InvalidArgument'] [ 771.009300] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] Traceback (most recent call last): [ 771.009300] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 771.009300] env[69648]: ERROR nova.compute.manager 
[instance: 54630c78-200e-4b36-8612-34f411e08821] self.driver.spawn(context, instance, image_meta, [ 771.009300] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 771.009300] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] self._vmops.spawn(context, instance, image_meta, injected_files, [ 771.009300] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 771.009300] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] self._fetch_image_if_missing(context, vi) [ 771.009300] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 771.009300] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] image_cache(vi, tmp_image_ds_loc) [ 771.009300] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] vm_util.copy_virtual_disk( [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] session._wait_for_task(vmdk_copy_task) [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] return self.wait_for_task(task_ref) [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] return evt.wait() [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] result = hub.switch() [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] return self.greenlet.switch() [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 771.009690] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] self.f(*self.args, **self.kw) [ 771.009987] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 771.009987] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] raise exceptions.translate_fault(task_info.error) [ 771.009987] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 771.009987] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] Faults: ['InvalidArgument'] [ 771.009987] env[69648]: ERROR nova.compute.manager [instance: 54630c78-200e-4b36-8612-34f411e08821] [ 771.009987] env[69648]: DEBUG nova.compute.utils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 771.012026] env[69648]: DEBUG nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Build of instance 54630c78-200e-4b36-8612-34f411e08821 was re-scheduled: A specified parameter was not correct: fileType [ 771.012026] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 771.012026] env[69648]: DEBUG nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 771.012181] env[69648]: DEBUG nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 771.012318] env[69648]: DEBUG nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 771.013023] env[69648]: DEBUG nova.network.neutron [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 771.346187] env[69648]: DEBUG nova.network.neutron [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 771.357707] env[69648]: INFO nova.compute.manager [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] [instance: 54630c78-200e-4b36-8612-34f411e08821] Took 0.35 seconds to deallocate network for instance. [ 771.496768] env[69648]: DEBUG oslo_concurrency.lockutils [None req-28209380-6d0f-4662-8ad1-8c0fab5354f2 tempest-TenantUsagesTestJSON-2137585408 tempest-TenantUsagesTestJSON-2137585408-project-member] Lock "54630c78-200e-4b36-8612-34f411e08821" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.683s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 771.497472] env[69648]: Traceback (most recent call last): [ 771.497512] env[69648]: File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 771.497512] env[69648]: self.driver.spawn(context, instance, image_meta, [ 771.497512] env[69648]: File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 771.497512] env[69648]: self._vmops.spawn(context, instance, image_meta, injected_files, [ 771.497512] env[69648]: File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 771.497512] env[69648]: self._fetch_image_if_missing(context, vi) [ 771.497512] env[69648]: File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 771.497512] env[69648]: image_cache(vi, tmp_image_ds_loc) [ 771.497512] env[69648]: File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 771.497512] env[69648]: vm_util.copy_virtual_disk( [ 771.497512] env[69648]: File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 771.497512] env[69648]: session._wait_for_task(vmdk_copy_task) [ 771.497512] env[69648]: File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 771.497512] env[69648]: return self.wait_for_task(task_ref) [ 771.497512] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 771.497512] env[69648]: return evt.wait() [ 771.497512] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, 
in wait [ 771.497512] env[69648]: result = hub.switch() [ 771.497512] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 771.497512] env[69648]: return self.greenlet.switch() [ 771.498035] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 771.498035] env[69648]: self.f(*self.args, **self.kw) [ 771.498035] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 771.498035] env[69648]: raise exceptions.translate_fault(task_info.error) [ 771.498035] env[69648]: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 771.498035] env[69648]: Faults: ['InvalidArgument'] [ 771.498035] env[69648]: During handling of the above exception, another exception occurred: [ 771.498035] env[69648]: Traceback (most recent call last): [ 771.498035] env[69648]: File "/opt/stack/nova/nova/compute/manager.py", line 2431, in _do_build_and_run_instance [ 771.498035] env[69648]: self._build_and_run_instance(context, instance, image, [ 771.498035] env[69648]: File "/opt/stack/nova/nova/compute/manager.py", line 2723, in _build_and_run_instance [ 771.498035] env[69648]: raise exception.RescheduledException( [ 771.498035] env[69648]: nova.exception.RescheduledException: Build of instance 54630c78-200e-4b36-8612-34f411e08821 was re-scheduled: A specified parameter was not correct: fileType [ 771.498035] env[69648]: Faults: ['InvalidArgument'] [ 771.498035] env[69648]: During handling of the above exception, another exception occurred: [ 771.498035] env[69648]: Traceback (most recent call last): [ 771.498035] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenpool.py", line 88, in _spawn_n_impl [ 771.498035] env[69648]: func(*args, **kwargs) [ 771.498035] env[69648]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 771.498035] env[69648]: return func(*args, **kwargs) [ 771.498035] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 771.498035] env[69648]: return f(*args, **kwargs) [ 771.498035] env[69648]: File "/opt/stack/nova/nova/compute/manager.py", line 2322, in _locked_do_build_and_run_instance [ 771.498035] env[69648]: result = self._do_build_and_run_instance(*args, **kwargs) [ 771.498035] env[69648]: File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 771.498035] env[69648]: with excutils.save_and_reraise_exception(): [ 771.498035] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 771.498035] env[69648]: self.force_reraise() [ 771.498035] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 771.498035] env[69648]: raise self.value [ 771.498987] env[69648]: File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 771.498987] env[69648]: return f(self, context, *args, **kw) [ 771.498987] env[69648]: File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 771.498987] env[69648]: with excutils.save_and_reraise_exception(): [ 771.498987] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 771.498987] env[69648]: self.force_reraise() [ 771.498987] env[69648]: File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 771.498987] env[69648]: raise self.value [ 771.498987] env[69648]: File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 771.498987] env[69648]: return function(self, context, *args, **kwargs) [ 771.498987] env[69648]: File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function [ 771.498987] env[69648]: return function(self, context, *args, **kwargs) [ 771.498987] env[69648]: File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 771.498987] env[69648]: return function(self, context, *args, **kwargs) [ 771.498987] env[69648]: File "/opt/stack/nova/nova/compute/manager.py", line 2466, in _do_build_and_run_instance [ 771.498987] env[69648]: instance.save() [ 771.498987] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 771.498987] env[69648]: updates, result = self.indirection_api.object_action( [ 771.498987] env[69648]: File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 771.498987] env[69648]: return cctxt.call(context, 'object_action', objinst=objinst, [ 771.498987] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/client.py", line 190, in call [ 771.498987] env[69648]: result = self.transport._send( [ 771.498987] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/transport.py", line 123, in _send [ 771.498987] env[69648]: return self._driver.send(target, ctxt, message, [ 771.498987] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 771.498987] env[69648]: return self._send(target, ctxt, message, wait_for_reply, timeout, [ 771.498987] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 771.498987] env[69648]: raise result [ 771.500040] env[69648]: nova.exception_Remote.InstanceNotFound_Remote: Instance 54630c78-200e-4b36-8612-34f411e08821 could not be found. 
[ 771.500040] env[69648]: Traceback (most recent call last): [ 771.500040] env[69648]: File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 771.500040] env[69648]: return getattr(target, method)(*args, **kwargs) [ 771.500040] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 771.500040] env[69648]: return fn(self, *args, **kwargs) [ 771.500040] env[69648]: File "/opt/stack/nova/nova/objects/instance.py", line 884, in save [ 771.500040] env[69648]: old_ref, inst_ref = db.instance_update_and_get_original( [ 771.500040] env[69648]: File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 771.500040] env[69648]: return f(*args, **kwargs) [ 771.500040] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/api.py", line 144, in wrapper [ 771.500040] env[69648]: with excutils.save_and_reraise_exception() as ectxt: [ 771.500040] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 771.500040] env[69648]: self.force_reraise() [ 771.500040] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 771.500040] env[69648]: raise self.value [ 771.500040] env[69648]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/api.py", line 142, in wrapper [ 771.500040] env[69648]: return f(*args, **kwargs) [ 771.500040] env[69648]: File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 771.500040] env[69648]: return f(context, *args, **kwargs) [ 771.500040] env[69648]: File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 771.500040] env[69648]: instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 771.500040] env[69648]: File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 771.500040] env[69648]: raise exception.InstanceNotFound(instance_id=uuid) [ 771.500040] env[69648]: nova.exception.InstanceNotFound: Instance 54630c78-200e-4b36-8612-34f411e08821 could not be found. [ 771.500685] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "54630c78-200e-4b36-8612-34f411e08821" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 197.500s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 771.500685] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 54630c78-200e-4b36-8612-34f411e08821] During sync_power_state the instance has a pending task (spawning). Skip. [ 771.500685] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "54630c78-200e-4b36-8612-34f411e08821" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 771.516445] env[69648]: DEBUG nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 771.575726] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 771.576024] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 771.577646] env[69648]: INFO nova.compute.claims [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 772.047122] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a5ac457-ba52-4a36-87b1-3997c4f4fdb0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.054851] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ae69551-5d02-4b7b-ad9d-0750f233ff23 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.088308] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66b882b6-be43-4fe7-8bca-5a44e352768f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.099379] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-084b6a5f-cac4-45e3-a1ed-217f5713d21c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.105844] env[69648]: DEBUG oslo_concurrency.lockutils [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquiring lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 772.115988] env[69648]: DEBUG nova.compute.provider_tree [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 772.124847] env[69648]: DEBUG nova.scheduler.client.report [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 
'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 772.139579] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.563s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 772.140093] env[69648]: DEBUG nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 772.181939] env[69648]: DEBUG nova.compute.utils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 772.183481] env[69648]: DEBUG nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 772.183577] env[69648]: DEBUG nova.network.neutron [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 772.193380] env[69648]: DEBUG nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 772.261297] env[69648]: DEBUG nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 772.286875] env[69648]: DEBUG nova.policy [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2b06de6e07a48088211d317d070b18b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f4ba7bf91c1544e281527560cebedfb5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 772.295414] env[69648]: DEBUG nova.virt.hardware [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 772.295660] env[69648]: DEBUG nova.virt.hardware [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 772.295819] env[69648]: DEBUG nova.virt.hardware [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 772.296008] env[69648]: DEBUG nova.virt.hardware [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 772.296245] env[69648]: DEBUG nova.virt.hardware [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 772.296778] env[69648]: DEBUG nova.virt.hardware [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 
772.296778] env[69648]: DEBUG nova.virt.hardware [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 772.296778] env[69648]: DEBUG nova.virt.hardware [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 772.296951] env[69648]: DEBUG nova.virt.hardware [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 772.297119] env[69648]: DEBUG nova.virt.hardware [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 772.297295] env[69648]: DEBUG nova.virt.hardware [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 772.298492] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ac6464b-e941-4c67-afaa-57afd06d0b70 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.307102] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-462a5e78-79ce-45b6-90a0-a105bcb04398 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.635394] env[69648]: DEBUG nova.network.neutron [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Successfully created port: e926a684-f92e-4292-a834-1dc7af7ad0ae {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 774.071198] env[69648]: DEBUG nova.compute.manager [req-0a6710e8-8329-4030-be77-2ecd7d38e902 req-d84d4844-f0d3-4bfb-b001-f94480154bfe service nova] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Received event network-vif-plugged-e926a684-f92e-4292-a834-1dc7af7ad0ae {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 774.071518] env[69648]: DEBUG oslo_concurrency.lockutils [req-0a6710e8-8329-4030-be77-2ecd7d38e902 req-d84d4844-f0d3-4bfb-b001-f94480154bfe service nova] Acquiring lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 774.071808] env[69648]: DEBUG oslo_concurrency.lockutils 
[req-0a6710e8-8329-4030-be77-2ecd7d38e902 req-d84d4844-f0d3-4bfb-b001-f94480154bfe service nova] Lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 774.071943] env[69648]: DEBUG oslo_concurrency.lockutils [req-0a6710e8-8329-4030-be77-2ecd7d38e902 req-d84d4844-f0d3-4bfb-b001-f94480154bfe service nova] Lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.072109] env[69648]: DEBUG nova.compute.manager [req-0a6710e8-8329-4030-be77-2ecd7d38e902 req-d84d4844-f0d3-4bfb-b001-f94480154bfe service nova] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] No waiting events found dispatching network-vif-plugged-e926a684-f92e-4292-a834-1dc7af7ad0ae {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 774.072348] env[69648]: WARNING nova.compute.manager [req-0a6710e8-8329-4030-be77-2ecd7d38e902 req-d84d4844-f0d3-4bfb-b001-f94480154bfe service nova] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Received unexpected event network-vif-plugged-e926a684-f92e-4292-a834-1dc7af7ad0ae for instance with vm_state building and task_state spawning. [ 774.190934] env[69648]: DEBUG nova.network.neutron [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Successfully updated port: e926a684-f92e-4292-a834-1dc7af7ad0ae {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 774.204696] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "refresh_cache-45ccc6ec-6501-4477-9b94-1c0e3d1271d9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 774.204848] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired lock "refresh_cache-45ccc6ec-6501-4477-9b94-1c0e3d1271d9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 774.205013] env[69648]: DEBUG nova.network.neutron [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 774.265715] env[69648]: DEBUG nova.network.neutron [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 774.746957] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquiring lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 774.749941] env[69648]: DEBUG nova.network.neutron [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Updating instance_info_cache with network_info: [{"id": "e926a684-f92e-4292-a834-1dc7af7ad0ae", "address": "fa:16:3e:2d:ea:e7", "network": {"id": "07cb66a4-31ee-4795-acb9-03b9394fe3fe", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1187613248-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4ba7bf91c1544e281527560cebedfb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape926a684-f9", "ovs_interfaceid": "e926a684-f92e-4292-a834-1dc7af7ad0ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 774.762082] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Releasing lock "refresh_cache-45ccc6ec-6501-4477-9b94-1c0e3d1271d9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 774.762380] env[69648]: DEBUG nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Instance network_info: |[{"id": "e926a684-f92e-4292-a834-1dc7af7ad0ae", "address": "fa:16:3e:2d:ea:e7", "network": {"id": "07cb66a4-31ee-4795-acb9-03b9394fe3fe", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1187613248-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4ba7bf91c1544e281527560cebedfb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape926a684-f9", "ovs_interfaceid": "e926a684-f92e-4292-a834-1dc7af7ad0ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 774.762751] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2d:ea:e7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7abeeabc-351d-404c-ada6-6a7305667707', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e926a684-f92e-4292-a834-1dc7af7ad0ae', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 774.771238] env[69648]: DEBUG oslo.service.loopingcall [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 774.772137] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 774.772393] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-89f0096a-89cd-4ee3-b8c2-abf224625f0c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 774.793124] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 774.793124] env[69648]: value = "task-3466500" [ 774.793124] env[69648]: _type = "Task" [ 774.793124] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 775.307545] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466500, 'name': CreateVM_Task} progress is 99%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 775.806198] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466500, 'name': CreateVM_Task} progress is 99%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 776.308080] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466500, 'name': CreateVM_Task, 'duration_secs': 1.345889} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 776.309828] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 776.309828] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 776.309828] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 776.311576] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 776.311576] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d1d8af6e-06f7-433a-8e73-e0adc37de9db {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.318022] env[69648]: DEBUG oslo_vmware.api [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 776.318022] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c6f2c1-f551-c9da-b8dd-c21088500ab1" [ 776.318022] env[69648]: _type = "Task" [ 776.318022] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 776.328702] env[69648]: DEBUG oslo_vmware.api [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c6f2c1-f551-c9da-b8dd-c21088500ab1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 776.406647] env[69648]: DEBUG nova.compute.manager [req-5169f149-4ac1-4ebf-a2fb-6817116aa5e9 req-db07f232-f06e-46cc-a248-002812195cca service nova] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Received event network-changed-e926a684-f92e-4292-a834-1dc7af7ad0ae {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 776.406847] env[69648]: DEBUG nova.compute.manager [req-5169f149-4ac1-4ebf-a2fb-6817116aa5e9 req-db07f232-f06e-46cc-a248-002812195cca service nova] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Refreshing instance network info cache due to event network-changed-e926a684-f92e-4292-a834-1dc7af7ad0ae. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 776.407085] env[69648]: DEBUG oslo_concurrency.lockutils [req-5169f149-4ac1-4ebf-a2fb-6817116aa5e9 req-db07f232-f06e-46cc-a248-002812195cca service nova] Acquiring lock "refresh_cache-45ccc6ec-6501-4477-9b94-1c0e3d1271d9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 776.407233] env[69648]: DEBUG oslo_concurrency.lockutils [req-5169f149-4ac1-4ebf-a2fb-6817116aa5e9 req-db07f232-f06e-46cc-a248-002812195cca service nova] Acquired lock "refresh_cache-45ccc6ec-6501-4477-9b94-1c0e3d1271d9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 776.407394] env[69648]: DEBUG nova.network.neutron [req-5169f149-4ac1-4ebf-a2fb-6817116aa5e9 req-db07f232-f06e-46cc-a248-002812195cca service nova] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Refreshing network info cache for port e926a684-f92e-4292-a834-1dc7af7ad0ae {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 776.759981] env[69648]: DEBUG nova.network.neutron [req-5169f149-4ac1-4ebf-a2fb-6817116aa5e9 req-db07f232-f06e-46cc-a248-002812195cca service nova] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Updated VIF entry in instance network info cache for port e926a684-f92e-4292-a834-1dc7af7ad0ae. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 776.760184] env[69648]: DEBUG nova.network.neutron [req-5169f149-4ac1-4ebf-a2fb-6817116aa5e9 req-db07f232-f06e-46cc-a248-002812195cca service nova] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Updating instance_info_cache with network_info: [{"id": "e926a684-f92e-4292-a834-1dc7af7ad0ae", "address": "fa:16:3e:2d:ea:e7", "network": {"id": "07cb66a4-31ee-4795-acb9-03b9394fe3fe", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1187613248-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4ba7bf91c1544e281527560cebedfb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape926a684-f9", "ovs_interfaceid": "e926a684-f92e-4292-a834-1dc7af7ad0ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 776.769614] env[69648]: DEBUG oslo_concurrency.lockutils [req-5169f149-4ac1-4ebf-a2fb-6817116aa5e9 req-db07f232-f06e-46cc-a248-002812195cca service nova] Releasing lock "refresh_cache-45ccc6ec-6501-4477-9b94-1c0e3d1271d9" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 776.828708] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Releasing lock "[datastore1] 
devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 776.829353] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 776.829730] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 787.225435] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquiring lock "dfbb396b-8f18-456d-9064-be451cdd1ac9" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 789.358116] env[69648]: DEBUG oslo_concurrency.lockutils [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquiring lock "928bc799-4fed-4005-89d2-e18196f88ffb" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 795.501783] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "ed63f202-c76d-4492-b738-606ee1c6b059" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 796.473032] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 802.885613] env[69648]: DEBUG oslo_concurrency.lockutils [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 807.383785] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquiring lock "60b00251-25fc-483d-88fe-a84165d6a435" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 807.383785] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock "60b00251-25fc-483d-88fe-a84165d6a435" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.287388] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3e643fd3-12c1-4144-9159-1bde5f48c2dd tempest-ServersTestMultiNic-1461458840 tempest-ServersTestMultiNic-1461458840-project-member] Acquiring lock "401783f7-a434-4c01-8f9a-e3f5fecd10da" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 812.287875] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3e643fd3-12c1-4144-9159-1bde5f48c2dd tempest-ServersTestMultiNic-1461458840 tempest-ServersTestMultiNic-1461458840-project-member] Lock "401783f7-a434-4c01-8f9a-e3f5fecd10da" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 815.685011] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquiring lock "6062dd02-230d-42bc-8304-fc122f1f1489" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 815.685309] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "6062dd02-230d-42bc-8304-fc122f1f1489" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 816.467691] env[69648]: WARNING oslo_vmware.rw_handles [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 816.467691] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 816.467691] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 816.467691] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 816.467691] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 816.467691] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 816.467691] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 816.467691] env[69648]: ERROR 
oslo_vmware.rw_handles version, status, reason = self._read_status() [ 816.467691] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 816.467691] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 816.467691] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 816.467691] env[69648]: ERROR oslo_vmware.rw_handles [ 816.468142] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/52c21a2a-f8ae-47cd-99c0-95ab1d22fea7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 816.469915] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 816.470214] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Copying Virtual Disk [datastore1] vmware_temp/52c21a2a-f8ae-47cd-99c0-95ab1d22fea7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/52c21a2a-f8ae-47cd-99c0-95ab1d22fea7/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 816.470541] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6c4922d9-84d4-4222-be6c-c2144801eb09 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.479105] env[69648]: DEBUG oslo_vmware.api [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Waiting for the task: (returnval){ [ 816.479105] env[69648]: value = "task-3466501" [ 816.479105] env[69648]: _type = "Task" [ 816.479105] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 816.487121] env[69648]: DEBUG oslo_vmware.api [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Task: {'id': task-3466501, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 816.688288] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ee9717d6-9875-49d7-8ff5-2c5a9d52f95d tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquiring lock "fc6cfa72-0132-4bf2-9054-b1064d3e4efb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 816.688655] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ee9717d6-9875-49d7-8ff5-2c5a9d52f95d tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "fc6cfa72-0132-4bf2-9054-b1064d3e4efb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 816.993892] env[69648]: DEBUG oslo_vmware.exceptions [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 816.994281] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 816.994840] env[69648]: ERROR nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 816.994840] env[69648]: Faults: ['InvalidArgument'] [ 816.994840] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Traceback (most recent call last): [ 816.994840] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 816.994840] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] yield resources [ 816.994840] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 816.994840] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] self.driver.spawn(context, instance, image_meta, [ 816.994840] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 816.994840] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 816.994840] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 816.994840] env[69648]: ERROR 
nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] self._fetch_image_if_missing(context, vi) [ 816.994840] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] image_cache(vi, tmp_image_ds_loc) [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] vm_util.copy_virtual_disk( [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] session._wait_for_task(vmdk_copy_task) [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] return self.wait_for_task(task_ref) [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] return evt.wait() [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] result = hub.switch() [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 816.995342] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] return self.greenlet.switch() [ 816.995651] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 816.995651] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] self.f(*self.args, **self.kw) [ 816.995651] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 816.995651] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] raise exceptions.translate_fault(task_info.error) [ 816.995651] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 816.995651] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Faults: ['InvalidArgument'] [ 816.995651] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] [ 816.995651] env[69648]: INFO 
nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Terminating instance [ 816.997073] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 816.997349] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 816.998143] env[69648]: DEBUG nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 816.998398] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 816.998695] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-922a1cd6-7509-4fed-a132-3b8810f15862 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.003140] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6d80160-3d35-44af-8bff-55de042f3abe {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.010722] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 817.011277] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9d121883-3160-41b8-932a-69ac1813089a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.015519] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 817.015703] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 
tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 817.016442] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-28dbd720-06f7-432a-af27-a00e875fe1cc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.024676] env[69648]: DEBUG oslo_vmware.api [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Waiting for the task: (returnval){ [ 817.024676] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52535067-3f7c-cbfe-2bb6-283807f05b85" [ 817.024676] env[69648]: _type = "Task" [ 817.024676] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 817.030960] env[69648]: DEBUG oslo_vmware.api [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52535067-3f7c-cbfe-2bb6-283807f05b85, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 817.085757] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 817.085968] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 817.086284] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Deleting the datastore file [datastore1] 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 817.086454] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7a86d953-1bc2-4909-bd17-042162d623e1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.093124] env[69648]: DEBUG oslo_vmware.api [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Waiting for the task: (returnval){ [ 817.093124] env[69648]: value = "task-3466503" [ 817.093124] env[69648]: _type = "Task" [ 817.093124] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 817.101386] env[69648]: DEBUG oslo_vmware.api [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Task: {'id': task-3466503, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 817.533115] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 817.533377] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Creating directory with path [datastore1] vmware_temp/e79c195f-876a-4d70-b7f8-dea459037272/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 817.533613] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b9c73f5d-5284-434d-bd26-6f9da1b7190e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.547606] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Created directory with path [datastore1] vmware_temp/e79c195f-876a-4d70-b7f8-dea459037272/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 817.551023] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Fetch image to [datastore1] vmware_temp/e79c195f-876a-4d70-b7f8-dea459037272/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 817.551023] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/e79c195f-876a-4d70-b7f8-dea459037272/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 817.551023] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6fbcb9c-0345-4e70-8383-11c3e3a9336b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.557310] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcb1aede-00b4-4bf1-adcd-2a8114e0e001 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.571318] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aff6bbd5-c94c-41b3-a347-e7a72fd30262 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.613654] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02ccd89d-47e3-409f-822f-591566a0747f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.625456] env[69648]: DEBUG oslo_vmware.api [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Task: {'id': task-3466503, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.125976} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 817.625456] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 817.625456] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 817.625456] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 817.625456] env[69648]: INFO nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Took 0.63 seconds to destroy the instance on the hypervisor. 
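The records above show oslo.vmware's task-polling pattern end to end: CreateVM_Task, CopyVirtualDisk_Task and DeleteDatastoreFile_Task are each submitted ("Invoking ..._Task"), then _poll_task reports progress until the task completes or its fault is translated and raised (here CopyVirtualDisk_Task fails with InvalidArgument on fileType while caching the sparse image). Below is a minimal sketch of that polling loop, written against pyVmomi-style task objects rather than Nova's internal session wrapper; the poll interval, exception class and attribute access are illustrative assumptions, not the oslo.vmware implementation.

    import time

    class TaskFailed(Exception):
        """Raised when a vSphere task finishes in the 'error' state (illustrative)."""

    def wait_for_task(task, poll_interval=0.5):
        # Poll the task's info block until it leaves the queued/running
        # states, mirroring what _poll_task is doing in the records above.
        while True:
            state = task.info.state  # 'queued' | 'running' | 'success' | 'error'
            if state == 'success':
                return task.info.result
            if state == 'error':
                # oslo.vmware translates the fault (e.g. InvalidArgument) into a
                # VimFaultException; this sketch just surfaces the fault message.
                raise TaskFailed(task.info.error.localizedMessage)
            time.sleep(poll_interval)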
[ 817.627083] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8d2ba6fc-3d41-405e-8b7a-d32a6017949f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.629162] env[69648]: DEBUG nova.compute.claims [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 817.629332] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 817.629547] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 817.652814] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 817.738413] env[69648]: DEBUG oslo_vmware.rw_handles [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e79c195f-876a-4d70-b7f8-dea459037272/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 817.806880] env[69648]: DEBUG oslo_vmware.rw_handles [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 817.807075] env[69648]: DEBUG oslo_vmware.rw_handles [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e79c195f-876a-4d70-b7f8-dea459037272/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 817.880075] env[69648]: DEBUG oslo_concurrency.lockutils [None req-00385dc7-2d7d-432c-9eee-af054dcc92b8 tempest-FloatingIPsAssociationTestJSON-1510815455 tempest-FloatingIPsAssociationTestJSON-1510815455-project-member] Acquiring lock "a8a08a83-45f8-43d1-b405-52c751bc2e0a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 817.880323] env[69648]: DEBUG oslo_concurrency.lockutils [None req-00385dc7-2d7d-432c-9eee-af054dcc92b8 tempest-FloatingIPsAssociationTestJSON-1510815455 tempest-FloatingIPsAssociationTestJSON-1510815455-project-member] Lock "a8a08a83-45f8-43d1-b405-52c751bc2e0a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 818.134952] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb1c7a44-678b-47ea-ae9a-e939506b90a9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.143534] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a261c04-2a1c-4774-8329-a4f1b6912f73 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.179265] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8801767b-2985-429b-b8eb-01cdce7d571d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.188773] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbdc779f-6868-4cd4-9334-fd3d5eb45dca {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.205062] env[69648]: DEBUG nova.compute.provider_tree [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 818.213794] env[69648]: DEBUG nova.scheduler.client.report [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 818.231921] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.602s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 818.232047] env[69648]: ERROR nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 818.232047] env[69648]: Faults: ['InvalidArgument'] [ 818.232047] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Traceback (most recent call last): [ 818.232047] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 818.232047] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] self.driver.spawn(context, instance, image_meta, [ 818.232047] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 818.232047] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 818.232047] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 818.232047] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] self._fetch_image_if_missing(context, vi) [ 818.232047] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 818.232047] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] image_cache(vi, tmp_image_ds_loc) [ 818.232047] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] vm_util.copy_virtual_disk( [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] session._wait_for_task(vmdk_copy_task) [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] return self.wait_for_task(task_ref) [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] return evt.wait() [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 
125, in wait [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] result = hub.switch() [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] return self.greenlet.switch() [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 818.232386] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] self.f(*self.args, **self.kw) [ 818.232771] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 818.232771] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] raise exceptions.translate_fault(task_info.error) [ 818.232771] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 818.232771] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Faults: ['InvalidArgument'] [ 818.232771] env[69648]: ERROR nova.compute.manager [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] [ 818.232771] env[69648]: DEBUG nova.compute.utils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 818.235096] env[69648]: DEBUG nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Build of instance 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1 was re-scheduled: A specified parameter was not correct: fileType [ 818.235096] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 818.235403] env[69648]: DEBUG nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 818.235703] env[69648]: DEBUG nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 818.235917] env[69648]: DEBUG nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 818.236152] env[69648]: DEBUG nova.network.neutron [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 818.740315] env[69648]: DEBUG nova.network.neutron [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 818.757512] env[69648]: INFO nova.compute.manager [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Took 0.52 seconds to deallocate network for instance. [ 818.973465] env[69648]: INFO nova.scheduler.client.report [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Deleted allocations for instance 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1 [ 818.993174] env[69648]: DEBUG oslo_concurrency.lockutils [None req-638cb966-e3a5-4146-9e42-495dd8569808 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 245.030s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 818.993927] env[69648]: DEBUG oslo_concurrency.lockutils [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 46.888s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 818.994646] env[69648]: DEBUG oslo_concurrency.lockutils [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquiring lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 818.994876] env[69648]: DEBUG oslo_concurrency.lockutils [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 818.995255] env[69648]: DEBUG oslo_concurrency.lockutils [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 818.996936] env[69648]: INFO nova.compute.manager [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Terminating instance [ 818.998620] env[69648]: DEBUG nova.compute.manager [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 818.998839] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 818.999486] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d719e6b1-2719-4eaa-b83a-2ca50def34d2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.010064] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0ba7407-7add-4bf7-9ce7-db4a884f5ada {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.021910] env[69648]: DEBUG nova.compute.manager [None req-f9ae89c0-a4b6-4e91-939e-1300cc47ee17 tempest-InstanceActionsTestJSON-825947943 tempest-InstanceActionsTestJSON-825947943-project-member] [instance: 741e7ee9-8ee1-4b36-89cc-a640e6f6b97a] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.043023] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1 could not be found. [ 819.043910] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 819.043910] env[69648]: INFO nova.compute.manager [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Took 0.04 seconds to destroy the instance on the hypervisor. 
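The Acquiring / acquired / released triples that dominate this part of the log come from oslo.concurrency's lockutils, which Nova uses to serialize per-instance work such as do_terminate_instance and _locked_do_build_and_run_instance and to guard shared state like "compute_resources". A minimal sketch of that usage follows; the lock names and function bodies are illustrative only, not Nova's actual code.

    from oslo_concurrency import lockutils

    # Decorator form: serialize every call touching one instance UUID,
    # which is what produces the 'Lock "<uuid>" acquired by ...' records.
    @lockutils.synchronized('45ccc6ec-6501-4477-9b94-1c0e3d1271d9')
    def do_terminate_instance():
        print("terminate logic runs with the per-instance lock held")

    # Context-manager form: the same primitive guards short critical
    # sections such as the resource tracker's "compute_resources" lock.
    with lockutils.lock('compute_resources'):
        print("resource tracker update runs here")

    do_terminate_instance()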
[ 819.043910] env[69648]: DEBUG oslo.service.loopingcall [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 819.043910] env[69648]: DEBUG nova.compute.manager [-] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 819.044198] env[69648]: DEBUG nova.network.neutron [-] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 819.075264] env[69648]: DEBUG nova.compute.manager [None req-f9ae89c0-a4b6-4e91-939e-1300cc47ee17 tempest-InstanceActionsTestJSON-825947943 tempest-InstanceActionsTestJSON-825947943-project-member] [instance: 741e7ee9-8ee1-4b36-89cc-a640e6f6b97a] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.081770] env[69648]: DEBUG nova.network.neutron [-] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 819.091770] env[69648]: INFO nova.compute.manager [-] [instance: 9550f67e-b7b2-48a9-b918-e2f5b38f3bc1] Took 0.05 seconds to deallocate network for instance. [ 819.097427] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f9ae89c0-a4b6-4e91-939e-1300cc47ee17 tempest-InstanceActionsTestJSON-825947943 tempest-InstanceActionsTestJSON-825947943-project-member] Lock "741e7ee9-8ee1-4b36-89cc-a640e6f6b97a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.015s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.110197] env[69648]: DEBUG nova.compute.manager [None req-334a8787-b1c3-40b5-9de0-7c5ec68f89b6 tempest-ServersTestMultiNic-1461458840 tempest-ServersTestMultiNic-1461458840-project-member] [instance: 28f086a0-3197-47a9-ad8d-6cfd3a59bfc3] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.166047] env[69648]: DEBUG nova.compute.manager [None req-334a8787-b1c3-40b5-9de0-7c5ec68f89b6 tempest-ServersTestMultiNic-1461458840 tempest-ServersTestMultiNic-1461458840-project-member] [instance: 28f086a0-3197-47a9-ad8d-6cfd3a59bfc3] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.207703] env[69648]: DEBUG oslo_concurrency.lockutils [None req-334a8787-b1c3-40b5-9de0-7c5ec68f89b6 tempest-ServersTestMultiNic-1461458840 tempest-ServersTestMultiNic-1461458840-project-member] Lock "28f086a0-3197-47a9-ad8d-6cfd3a59bfc3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.607s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.218922] env[69648]: DEBUG nova.compute.manager [None req-50bbdc90-ca31-454d-b67a-a3c1c164785e tempest-ServerActionsTestOtherA-1331415416 tempest-ServerActionsTestOtherA-1331415416-project-member] [instance: ca4953f5-de55-4476-a845-c633a073eb43] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.251317] env[69648]: DEBUG oslo_concurrency.lockutils [None req-967070dd-bed3-4416-81e3-69ad48249143 tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "9550f67e-b7b2-48a9-b918-e2f5b38f3bc1" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.257s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.255553] env[69648]: DEBUG nova.compute.manager [None req-50bbdc90-ca31-454d-b67a-a3c1c164785e tempest-ServerActionsTestOtherA-1331415416 tempest-ServerActionsTestOtherA-1331415416-project-member] [instance: ca4953f5-de55-4476-a845-c633a073eb43] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.277218] env[69648]: DEBUG oslo_concurrency.lockutils [None req-50bbdc90-ca31-454d-b67a-a3c1c164785e tempest-ServerActionsTestOtherA-1331415416 tempest-ServerActionsTestOtherA-1331415416-project-member] Lock "ca4953f5-de55-4476-a845-c633a073eb43" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.078s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.297830] env[69648]: DEBUG nova.compute.manager [None req-549e9643-5f06-4610-b3c5-efe995377b89 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 51ebee08-c929-4485-b229-cd99d35db2f7] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.327130] env[69648]: DEBUG nova.compute.manager [None req-549e9643-5f06-4610-b3c5-efe995377b89 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 51ebee08-c929-4485-b229-cd99d35db2f7] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.354817] env[69648]: DEBUG oslo_concurrency.lockutils [None req-549e9643-5f06-4610-b3c5-efe995377b89 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock "51ebee08-c929-4485-b229-cd99d35db2f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.085s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.372110] env[69648]: DEBUG nova.compute.manager [None req-c25321c7-35b8-47f4-bcb9-9be24352aabe tempest-ServerDiagnosticsNegativeTest-607771934 tempest-ServerDiagnosticsNegativeTest-607771934-project-member] [instance: 63c7b328-2c8a-42a6-b340-78528a656f9f] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.404552] env[69648]: DEBUG nova.compute.manager [None req-c25321c7-35b8-47f4-bcb9-9be24352aabe tempest-ServerDiagnosticsNegativeTest-607771934 tempest-ServerDiagnosticsNegativeTest-607771934-project-member] [instance: 63c7b328-2c8a-42a6-b340-78528a656f9f] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.434261] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c25321c7-35b8-47f4-bcb9-9be24352aabe tempest-ServerDiagnosticsNegativeTest-607771934 tempest-ServerDiagnosticsNegativeTest-607771934-project-member] Lock "63c7b328-2c8a-42a6-b340-78528a656f9f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.588s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.446085] env[69648]: DEBUG nova.compute.manager [None req-023c7526-4e6c-430d-8752-1d26e778481b tempest-VolumesAssistedSnapshotsTest-798663970 tempest-VolumesAssistedSnapshotsTest-798663970-project-member] [instance: 2b75aa03-750c-49b3-b69a-63bfee58942f] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.477288] env[69648]: DEBUG nova.compute.manager [None req-023c7526-4e6c-430d-8752-1d26e778481b tempest-VolumesAssistedSnapshotsTest-798663970 tempest-VolumesAssistedSnapshotsTest-798663970-project-member] [instance: 2b75aa03-750c-49b3-b69a-63bfee58942f] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.504686] env[69648]: DEBUG oslo_concurrency.lockutils [None req-023c7526-4e6c-430d-8752-1d26e778481b tempest-VolumesAssistedSnapshotsTest-798663970 tempest-VolumesAssistedSnapshotsTest-798663970-project-member] Lock "2b75aa03-750c-49b3-b69a-63bfee58942f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.894s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.515955] env[69648]: DEBUG nova.compute.manager [None req-0acf2512-9ceb-49fd-b443-30cc5a5afecc tempest-ServerMetadataTestJSON-622148251 tempest-ServerMetadataTestJSON-622148251-project-member] [instance: b21a0b12-14c7-491f-ae08-f596924490d3] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.545350] env[69648]: DEBUG nova.compute.manager [None req-0acf2512-9ceb-49fd-b443-30cc5a5afecc tempest-ServerMetadataTestJSON-622148251 tempest-ServerMetadataTestJSON-622148251-project-member] [instance: b21a0b12-14c7-491f-ae08-f596924490d3] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.571493] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0acf2512-9ceb-49fd-b443-30cc5a5afecc tempest-ServerMetadataTestJSON-622148251 tempest-ServerMetadataTestJSON-622148251-project-member] Lock "b21a0b12-14c7-491f-ae08-f596924490d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.945s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.581899] env[69648]: DEBUG nova.compute.manager [None req-1b67cd95-2280-4587-b9c1-43459d5a801e tempest-ImagesOneServerTestJSON-509649065 tempest-ImagesOneServerTestJSON-509649065-project-member] [instance: 267be1e0-7768-421d-9ae4-f8acb5331b23] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.606165] env[69648]: DEBUG nova.compute.manager [None req-1b67cd95-2280-4587-b9c1-43459d5a801e tempest-ImagesOneServerTestJSON-509649065 tempest-ImagesOneServerTestJSON-509649065-project-member] [instance: 267be1e0-7768-421d-9ae4-f8acb5331b23] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.633800] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1b67cd95-2280-4587-b9c1-43459d5a801e tempest-ImagesOneServerTestJSON-509649065 tempest-ImagesOneServerTestJSON-509649065-project-member] Lock "267be1e0-7768-421d-9ae4-f8acb5331b23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.823s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.647431] env[69648]: DEBUG nova.compute.manager [None req-17192bcf-4759-495b-b0f0-a84d6adb918d tempest-ServersWithSpecificFlavorTestJSON-1998279834 tempest-ServersWithSpecificFlavorTestJSON-1998279834-project-member] [instance: 50dd8f44-95ae-4b0c-ad88-2f11f4886d57] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.675158] env[69648]: DEBUG nova.compute.manager [None req-17192bcf-4759-495b-b0f0-a84d6adb918d tempest-ServersWithSpecificFlavorTestJSON-1998279834 tempest-ServersWithSpecificFlavorTestJSON-1998279834-project-member] [instance: 50dd8f44-95ae-4b0c-ad88-2f11f4886d57] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.699732] env[69648]: DEBUG oslo_concurrency.lockutils [None req-17192bcf-4759-495b-b0f0-a84d6adb918d tempest-ServersWithSpecificFlavorTestJSON-1998279834 tempest-ServersWithSpecificFlavorTestJSON-1998279834-project-member] Lock "50dd8f44-95ae-4b0c-ad88-2f11f4886d57" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.078s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.710752] env[69648]: DEBUG nova.compute.manager [None req-2327f0bc-09d4-4f66-bbd4-609e316788e6 tempest-ImagesNegativeTestJSON-660182022 tempest-ImagesNegativeTestJSON-660182022-project-member] [instance: 613910dc-5ba5-482f-8b77-bf978fe622dd] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.753399] env[69648]: DEBUG nova.compute.manager [None req-2327f0bc-09d4-4f66-bbd4-609e316788e6 tempest-ImagesNegativeTestJSON-660182022 tempest-ImagesNegativeTestJSON-660182022-project-member] [instance: 613910dc-5ba5-482f-8b77-bf978fe622dd] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.784328] env[69648]: DEBUG oslo_concurrency.lockutils [None req-2327f0bc-09d4-4f66-bbd4-609e316788e6 tempest-ImagesNegativeTestJSON-660182022 tempest-ImagesNegativeTestJSON-660182022-project-member] Lock "613910dc-5ba5-482f-8b77-bf978fe622dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.987s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.799187] env[69648]: DEBUG nova.compute.manager [None req-d6d75ef3-98ae-4962-8998-48cd9e466ac4 tempest-SecurityGroupsTestJSON-1260615493 tempest-SecurityGroupsTestJSON-1260615493-project-member] [instance: 6e484ac2-6437-488a-97e2-f5dedb5816c6] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.831321] env[69648]: DEBUG nova.compute.manager [None req-d6d75ef3-98ae-4962-8998-48cd9e466ac4 tempest-SecurityGroupsTestJSON-1260615493 tempest-SecurityGroupsTestJSON-1260615493-project-member] [instance: 6e484ac2-6437-488a-97e2-f5dedb5816c6] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.872020] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d6d75ef3-98ae-4962-8998-48cd9e466ac4 tempest-SecurityGroupsTestJSON-1260615493 tempest-SecurityGroupsTestJSON-1260615493-project-member] Lock "6e484ac2-6437-488a-97e2-f5dedb5816c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.892s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.886665] env[69648]: DEBUG nova.compute.manager [None req-0953c5eb-55fb-402c-829e-7d380eafab98 tempest-ServersAdminNegativeTestJSON-819959570 tempest-ServersAdminNegativeTestJSON-819959570-project-member] [instance: e0b188fb-3ec8-46c7-8966-ea4eaef2430b] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.913085] env[69648]: DEBUG nova.compute.manager [None req-0953c5eb-55fb-402c-829e-7d380eafab98 tempest-ServersAdminNegativeTestJSON-819959570 tempest-ServersAdminNegativeTestJSON-819959570-project-member] [instance: e0b188fb-3ec8-46c7-8966-ea4eaef2430b] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.964461] env[69648]: DEBUG oslo_concurrency.lockutils [None req-0953c5eb-55fb-402c-829e-7d380eafab98 tempest-ServersAdminNegativeTestJSON-819959570 tempest-ServersAdminNegativeTestJSON-819959570-project-member] Lock "e0b188fb-3ec8-46c7-8966-ea4eaef2430b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.814s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.975393] env[69648]: DEBUG nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 820.041048] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 820.041346] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 820.042924] env[69648]: INFO nova.compute.claims [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 820.474211] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bce6e660-ae04-41b1-b9c1-adf6b6a95b02 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.485148] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-747a7c96-253d-46cc-a33c-ad2b8097fdb9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.521045] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c0f6de8-2615-4b33-9c3d-8330e852346e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.527991] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-336d80ee-7524-4abb-9c95-bf1f85c74de4 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.541699] env[69648]: DEBUG nova.compute.provider_tree [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 820.552672] env[69648]: DEBUG nova.scheduler.client.report [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 820.572162] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.531s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 820.572677] env[69648]: DEBUG nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 820.623312] env[69648]: DEBUG nova.compute.utils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 820.627622] env[69648]: DEBUG nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 820.627622] env[69648]: DEBUG nova.network.neutron [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 820.639122] env[69648]: DEBUG nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Start building block device mappings for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 820.734236] env[69648]: DEBUG nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 820.781811] env[69648]: DEBUG nova.virt.hardware [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 820.781811] env[69648]: DEBUG nova.virt.hardware [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 820.782314] env[69648]: DEBUG nova.virt.hardware [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 820.784484] env[69648]: DEBUG nova.virt.hardware [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 820.784484] env[69648]: DEBUG nova.virt.hardware [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 820.784484] env[69648]: DEBUG nova.virt.hardware [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 820.784484] env[69648]: DEBUG nova.virt.hardware [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 820.784484] 
env[69648]: DEBUG nova.virt.hardware [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 820.784736] env[69648]: DEBUG nova.virt.hardware [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 820.784736] env[69648]: DEBUG nova.virt.hardware [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 820.784736] env[69648]: DEBUG nova.virt.hardware [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 820.785706] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cecdae0e-fbcb-4c09-92ca-df380f8d2296 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.797397] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c540f03-c70d-4771-a286-f588ca0dbcc2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.819638] env[69648]: DEBUG nova.policy [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eb29b1ed3b3043c984e2e40f6390df18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '39f31573c2f1444380010174c19fbdc2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 820.859817] env[69648]: DEBUG oslo_concurrency.lockutils [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquiring lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 822.324260] env[69648]: DEBUG oslo_concurrency.lockutils [None req-304fa61d-a4cb-4079-a7be-da3360b5863a tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Acquiring lock "0fe09233-c2e0-4d7b-b8df-689df7fdbced" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 822.324885] env[69648]: DEBUG 
oslo_concurrency.lockutils [None req-304fa61d-a4cb-4079-a7be-da3360b5863a tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "0fe09233-c2e0-4d7b-b8df-689df7fdbced" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 822.645737] env[69648]: DEBUG nova.network.neutron [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Successfully created port: 1a632e4a-570f-4a28-b686-5ed3bfd3464f {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 823.065536] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 823.065732] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}} [ 823.085561] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] There are 0 instances to clean {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}} [ 823.085696] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 823.086127] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances with incomplete migration {{(pid=69648) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}} [ 823.106906] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 824.115195] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 824.726968] env[69648]: DEBUG nova.network.neutron [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Successfully updated port: 1a632e4a-570f-4a28-b686-5ed3bfd3464f {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 824.744652] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquiring lock "refresh_cache-63b167e7-3d86-4ee4-8bae-bfb8fe084135" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 824.744652] env[69648]: DEBUG 
oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquired lock "refresh_cache-63b167e7-3d86-4ee4-8bae-bfb8fe084135" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 824.744777] env[69648]: DEBUG nova.network.neutron [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 824.860351] env[69648]: DEBUG nova.network.neutron [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 825.067781] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 825.068072] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 825.171247] env[69648]: DEBUG nova.compute.manager [req-6de22e30-a6d5-4f0b-aea5-c3ea1a0ea448 req-78ccdd23-ca53-4207-a1cc-d73d8bc195e4 service nova] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Received event network-vif-plugged-1a632e4a-570f-4a28-b686-5ed3bfd3464f {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 825.171247] env[69648]: DEBUG oslo_concurrency.lockutils [req-6de22e30-a6d5-4f0b-aea5-c3ea1a0ea448 req-78ccdd23-ca53-4207-a1cc-d73d8bc195e4 service nova] Acquiring lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 825.171247] env[69648]: DEBUG oslo_concurrency.lockutils [req-6de22e30-a6d5-4f0b-aea5-c3ea1a0ea448 req-78ccdd23-ca53-4207-a1cc-d73d8bc195e4 service nova] Lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 825.171597] env[69648]: DEBUG oslo_concurrency.lockutils [req-6de22e30-a6d5-4f0b-aea5-c3ea1a0ea448 req-78ccdd23-ca53-4207-a1cc-d73d8bc195e4 service nova] Lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 825.171597] env[69648]: DEBUG nova.compute.manager [req-6de22e30-a6d5-4f0b-aea5-c3ea1a0ea448 req-78ccdd23-ca53-4207-a1cc-d73d8bc195e4 service nova] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] No waiting events found dispatching network-vif-plugged-1a632e4a-570f-4a28-b686-5ed3bfd3464f 
{{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 825.171814] env[69648]: WARNING nova.compute.manager [req-6de22e30-a6d5-4f0b-aea5-c3ea1a0ea448 req-78ccdd23-ca53-4207-a1cc-d73d8bc195e4 service nova] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Received unexpected event network-vif-plugged-1a632e4a-570f-4a28-b686-5ed3bfd3464f for instance with vm_state building and task_state deleting. [ 825.784715] env[69648]: DEBUG nova.network.neutron [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Updating instance_info_cache with network_info: [{"id": "1a632e4a-570f-4a28-b686-5ed3bfd3464f", "address": "fa:16:3e:10:db:c9", "network": {"id": "4a9f1b1d-1a12-434b-8916-0866a2d68872", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-941517195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "39f31573c2f1444380010174c19fbdc2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1a632e4a-57", "ovs_interfaceid": "1a632e4a-570f-4a28-b686-5ed3bfd3464f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 825.806156] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Releasing lock "refresh_cache-63b167e7-3d86-4ee4-8bae-bfb8fe084135" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 825.806156] env[69648]: DEBUG nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Instance network_info: |[{"id": "1a632e4a-570f-4a28-b686-5ed3bfd3464f", "address": "fa:16:3e:10:db:c9", "network": {"id": "4a9f1b1d-1a12-434b-8916-0866a2d68872", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-941517195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "39f31573c2f1444380010174c19fbdc2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": 
"tap1a632e4a-57", "ovs_interfaceid": "1a632e4a-570f-4a28-b686-5ed3bfd3464f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 825.806457] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:10:db:c9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1a632e4a-570f-4a28-b686-5ed3bfd3464f', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 825.820951] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Creating folder: Project (39f31573c2f1444380010174c19fbdc2). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 825.823319] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f6d117b8-1be2-49e8-98b1-ac24c78278d9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.839730] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Created folder: Project (39f31573c2f1444380010174c19fbdc2) in parent group-v692308. [ 825.839730] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Creating folder: Instances. Parent ref: group-v692345. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 825.839730] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fa9fbe11-c671-49ab-a802-bd20c105a301 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.848591] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Created folder: Instances in parent group-v692345. [ 825.848591] env[69648]: DEBUG oslo.service.loopingcall [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 825.848591] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 825.848823] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0a6f62ab-3ac3-43fc-972c-7ef24348630f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.872062] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 825.872062] env[69648]: value = "task-3466506" [ 825.872062] env[69648]: _type = "Task" [ 825.872062] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 825.880725] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466506, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 826.066667] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 826.066667] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 826.066667] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 826.089685] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 826.089804] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 826.089909] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 826.090054] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 826.090185] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 826.090311] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 826.090914] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 826.090914] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 826.090914] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 826.090914] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 826.090914] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 826.091464] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 826.383620] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466506, 'name': CreateVM_Task, 'duration_secs': 0.352904} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 826.384052] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 826.384531] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 826.385891] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 826.385891] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 826.385891] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b8369d75-0454-4a0a-8efd-3d40e118cbdb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 826.392956] env[69648]: DEBUG oslo_vmware.api [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Waiting for the task: (returnval){ [ 826.392956] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]524f3604-e1af-e3ea-6897-d71626e341b3" [ 826.392956] env[69648]: _type = "Task" [ 826.392956] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 826.402575] env[69648]: DEBUG oslo_vmware.api [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]524f3604-e1af-e3ea-6897-d71626e341b3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 826.904661] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 826.905031] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 826.905241] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 827.086381] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 827.562402] env[69648]: DEBUG nova.compute.manager [req-e28ee769-5c2e-49cc-ae72-99614cfa2cac req-85e2b33b-1e72-449f-a4b1-05ebb8665d23 service nova] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Received event network-changed-1a632e4a-570f-4a28-b686-5ed3bfd3464f {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 827.562646] env[69648]: DEBUG nova.compute.manager [req-e28ee769-5c2e-49cc-ae72-99614cfa2cac req-85e2b33b-1e72-449f-a4b1-05ebb8665d23 service nova] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Refreshing instance network info cache due to event network-changed-1a632e4a-570f-4a28-b686-5ed3bfd3464f. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 827.562846] env[69648]: DEBUG oslo_concurrency.lockutils [req-e28ee769-5c2e-49cc-ae72-99614cfa2cac req-85e2b33b-1e72-449f-a4b1-05ebb8665d23 service nova] Acquiring lock "refresh_cache-63b167e7-3d86-4ee4-8bae-bfb8fe084135" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 827.562988] env[69648]: DEBUG oslo_concurrency.lockutils [req-e28ee769-5c2e-49cc-ae72-99614cfa2cac req-85e2b33b-1e72-449f-a4b1-05ebb8665d23 service nova] Acquired lock "refresh_cache-63b167e7-3d86-4ee4-8bae-bfb8fe084135" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 827.563163] env[69648]: DEBUG nova.network.neutron [req-e28ee769-5c2e-49cc-ae72-99614cfa2cac req-85e2b33b-1e72-449f-a4b1-05ebb8665d23 service nova] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Refreshing network info cache for port 1a632e4a-570f-4a28-b686-5ed3bfd3464f {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 828.065180] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 828.065437] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 828.065586] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 828.065744] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 828.078808] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 828.081418] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 828.081418] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 828.081418] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 828.082219] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-931378c4-7b3e-4233-a8a2-3fca277c2ea3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.097605] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1408c18e-0157-4018-a0b6-90de3e12d7a8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.117215] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2e84cef-52d3-475c-be69-0ca14511b295 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.128284] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-848a288b-9439-4080-a9d1-9fb9512ff40c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.167281] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180991MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 828.168019] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 828.168019] 
env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 828.193327] env[69648]: DEBUG nova.network.neutron [req-e28ee769-5c2e-49cc-ae72-99614cfa2cac req-85e2b33b-1e72-449f-a4b1-05ebb8665d23 service nova] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Updated VIF entry in instance network info cache for port 1a632e4a-570f-4a28-b686-5ed3bfd3464f. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 828.194656] env[69648]: DEBUG nova.network.neutron [req-e28ee769-5c2e-49cc-ae72-99614cfa2cac req-85e2b33b-1e72-449f-a4b1-05ebb8665d23 service nova] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Updating instance_info_cache with network_info: [{"id": "1a632e4a-570f-4a28-b686-5ed3bfd3464f", "address": "fa:16:3e:10:db:c9", "network": {"id": "4a9f1b1d-1a12-434b-8916-0866a2d68872", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-941517195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "39f31573c2f1444380010174c19fbdc2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1a632e4a-57", "ovs_interfaceid": "1a632e4a-570f-4a28-b686-5ed3bfd3464f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 828.218973] env[69648]: DEBUG oslo_concurrency.lockutils [req-e28ee769-5c2e-49cc-ae72-99614cfa2cac req-85e2b33b-1e72-449f-a4b1-05ebb8665d23 service nova] Releasing lock "refresh_cache-63b167e7-3d86-4ee4-8bae-bfb8fe084135" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 828.370055] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ce04d2df-8587-4cda-93b1-cad7ba3ff670 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 828.370055] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance dfbb396b-8f18-456d-9064-be451cdd1ac9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 828.370055] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 20bce654-7f57-4de6-8f7a-c1b34286fc86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 828.370055] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 928bc799-4fed-4005-89d2-e18196f88ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 828.370501] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ed63f202-c76d-4492-b738-606ee1c6b059 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 828.370501] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 1756fcf7-3d68-4d02-9a66-619d0a1a9505 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 828.370501] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 642ba6f1-b912-4f55-9199-9c98b58ffc1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 828.370501] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 91fcee48-3466-480d-bf87-bc4de17fbf31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 828.370669] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 828.370669] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 828.386992] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 2c03e5ce-0ebd-40b3-982a-f0f7d4742dde has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.400616] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 72591ff3-6bb9-4b0a-9f38-fa6111f74408 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.413784] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 19e7f1b0-5cd9-453f-8600-a7d76487de87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.424753] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 2dd9d8b9-8944-43cf-989b-e07354e29d40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.437681] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance dd0a5705-5745-439a-9fe2-23b852b86c2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.452548] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 12ddb227-b0e5-47cf-92b0-5c7338c1120e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.467762] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.486025] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.498464] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60b00251-25fc-483d-88fe-a84165d6a435 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.510016] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 401783f7-a434-4c01-8f9a-e3f5fecd10da has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.521580] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6062dd02-230d-42bc-8304-fc122f1f1489 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.537340] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc6cfa72-0132-4bf2-9054-b1064d3e4efb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.548470] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a8a08a83-45f8-43d1-b405-52c751bc2e0a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.561531] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0fe09233-c2e0-4d7b-b8df-689df7fdbced has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 828.564896] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 828.564896] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 828.587391] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing inventories for resource provider d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 828.604026] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Updating ProviderTree inventory for provider d38a352b-7808-44da-8216-792e96aadc88 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 828.604026] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Updating inventory in ProviderTree for provider d38a352b-7808-44da-8216-792e96aadc88 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 828.622553] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing aggregate associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, aggregates: None {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 828.651133] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing trait associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 829.114685] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5644a31-2820-4c09-bd44-38e2e688bf31 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 829.129144] env[69648]: DEBUG oslo_concurrency.lockutils [None req-744bd89d-dce4-4b00-a344-e54ae2f4cafe 
tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquiring lock "933d6ead-8da8-43cd-9f02-9373bad0348d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 829.129890] env[69648]: DEBUG oslo_concurrency.lockutils [None req-744bd89d-dce4-4b00-a344-e54ae2f4cafe tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "933d6ead-8da8-43cd-9f02-9373bad0348d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 829.131458] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84dd3417-5ad4-43d8-9a37-7813bf13f59d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 829.166946] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15fcdbbf-7971-47bd-8b7d-acf8a90415a3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 829.176894] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-524b9a4e-ab4b-496e-9562-ce21fb45c48e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 829.191230] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 829.207343] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 829.238129] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 829.238841] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.071s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 830.602477] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e9c449a6-6ea0-4713-b79b-77f0a4370b50 tempest-ServersNegativeTestMultiTenantJSON-629123672 tempest-ServersNegativeTestMultiTenantJSON-629123672-project-member] Acquiring lock 
"3968f404-57ba-4088-b516-eb9c085f6b75" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 830.602477] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e9c449a6-6ea0-4713-b79b-77f0a4370b50 tempest-ServersNegativeTestMultiTenantJSON-629123672 tempest-ServersNegativeTestMultiTenantJSON-629123672-project-member] Lock "3968f404-57ba-4088-b516-eb9c085f6b75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 831.834155] env[69648]: DEBUG oslo_concurrency.lockutils [None req-943063f8-f51e-4ac6-b5d4-936fd78123e0 tempest-SecurityGroupsTestJSON-1260615493 tempest-SecurityGroupsTestJSON-1260615493-project-member] Acquiring lock "8b1d1227-c6ec-4f6a-8076-b8e4b4efa12a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 831.834461] env[69648]: DEBUG oslo_concurrency.lockutils [None req-943063f8-f51e-4ac6-b5d4-936fd78123e0 tempest-SecurityGroupsTestJSON-1260615493 tempest-SecurityGroupsTestJSON-1260615493-project-member] Lock "8b1d1227-c6ec-4f6a-8076-b8e4b4efa12a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 834.237982] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e40de771-b591-44fb-941e-c55309602b66 tempest-AttachInterfacesV270Test-364647713 tempest-AttachInterfacesV270Test-364647713-project-member] Acquiring lock "7532d0c5-20f4-4b64-85f1-e7b16d15acf8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 834.237982] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e40de771-b591-44fb-941e-c55309602b66 tempest-AttachInterfacesV270Test-364647713 tempest-AttachInterfacesV270Test-364647713-project-member] Lock "7532d0c5-20f4-4b64-85f1-e7b16d15acf8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 840.334800] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1fef200c-e96d-4a17-8671-cbe43faaf80d tempest-ServerAddressesNegativeTestJSON-699051968 tempest-ServerAddressesNegativeTestJSON-699051968-project-member] Acquiring lock "6f31fc53-7a85-4db2-977a-a02f174c1eca" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 840.335184] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1fef200c-e96d-4a17-8671-cbe43faaf80d tempest-ServerAddressesNegativeTestJSON-699051968 tempest-ServerAddressesNegativeTestJSON-699051968-project-member] Lock "6f31fc53-7a85-4db2-977a-a02f174c1eca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 842.981456] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4f95580b-76a3-47e0-973f-54e35879eac8 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] Acquiring lock "c6473d99-b222-4b4b-8d2d-61876e54dc43" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 842.981783] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4f95580b-76a3-47e0-973f-54e35879eac8 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] Lock "c6473d99-b222-4b4b-8d2d-61876e54dc43" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 843.062598] env[69648]: DEBUG oslo_concurrency.lockutils [None req-70a2bcf3-7710-44b7-9547-1bd373d1a7d9 tempest-ServerRescueTestJSONUnderV235-182724449 tempest-ServerRescueTestJSONUnderV235-182724449-project-member] Acquiring lock "da62948a-a57e-4a0a-9fad-fc7de9f5f878" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 843.062598] env[69648]: DEBUG oslo_concurrency.lockutils [None req-70a2bcf3-7710-44b7-9547-1bd373d1a7d9 tempest-ServerRescueTestJSONUnderV235-182724449 tempest-ServerRescueTestJSONUnderV235-182724449-project-member] Lock "da62948a-a57e-4a0a-9fad-fc7de9f5f878" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 844.369869] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9497bfe8-fa77-428c-8856-f25356d886a1 tempest-ServerRescueNegativeTestJSON-258122650 tempest-ServerRescueNegativeTestJSON-258122650-project-member] Acquiring lock "b3a99599-9514-4702-b01e-95ccb064ed4a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 844.370450] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9497bfe8-fa77-428c-8856-f25356d886a1 tempest-ServerRescueNegativeTestJSON-258122650 tempest-ServerRescueNegativeTestJSON-258122650-project-member] Lock "b3a99599-9514-4702-b01e-95ccb064ed4a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 845.738185] env[69648]: DEBUG oslo_concurrency.lockutils [None req-43b0abd1-b7e8-48ef-b369-b97aec3aee5e tempest-ServerRescueNegativeTestJSON-258122650 tempest-ServerRescueNegativeTestJSON-258122650-project-member] Acquiring lock "7ca86d88-f679-40bd-a46d-20c39fe13247" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 845.738513] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-43b0abd1-b7e8-48ef-b369-b97aec3aee5e tempest-ServerRescueNegativeTestJSON-258122650 tempest-ServerRescueNegativeTestJSON-258122650-project-member] Lock "7ca86d88-f679-40bd-a46d-20c39fe13247" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 848.178247] env[69648]: DEBUG oslo_concurrency.lockutils [None req-cd14b2a8-a587-4a25-b3cd-dcfae09d49ed tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquiring lock "0fa3ac73-db70-4034-91da-29e42cefc471" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 848.178527] env[69648]: DEBUG oslo_concurrency.lockutils [None req-cd14b2a8-a587-4a25-b3cd-dcfae09d49ed tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "0fa3ac73-db70-4034-91da-29e42cefc471" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 858.208087] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e80aef8e-65f1-435d-b5e0-21ee3ca34dcf tempest-ServerActionsV293TestJSON-1131934826 tempest-ServerActionsV293TestJSON-1131934826-project-member] Acquiring lock "4f321d17-20ad-49d2-9952-c27e1161f339" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 858.208374] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e80aef8e-65f1-435d-b5e0-21ee3ca34dcf tempest-ServerActionsV293TestJSON-1131934826 tempest-ServerActionsV293TestJSON-1131934826-project-member] Lock "4f321d17-20ad-49d2-9952-c27e1161f339" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 866.489233] env[69648]: WARNING oslo_vmware.rw_handles [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 866.489233] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 866.489233] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 866.489233] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 866.489233] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 866.489233] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 866.489233] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 866.489233] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 866.489233] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 866.489233] env[69648]: 
ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 866.489233] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 866.489233] env[69648]: ERROR oslo_vmware.rw_handles [ 866.490134] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/e79c195f-876a-4d70-b7f8-dea459037272/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 866.491424] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 866.491685] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Copying Virtual Disk [datastore1] vmware_temp/e79c195f-876a-4d70-b7f8-dea459037272/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/e79c195f-876a-4d70-b7f8-dea459037272/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 866.492128] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8beda8d5-c125-497a-83cf-17058b82aeba {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 866.500745] env[69648]: DEBUG oslo_vmware.api [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Waiting for the task: (returnval){ [ 866.500745] env[69648]: value = "task-3466518" [ 866.500745] env[69648]: _type = "Task" [ 866.500745] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 866.508892] env[69648]: DEBUG oslo_vmware.api [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Task: {'id': task-3466518, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 867.011487] env[69648]: DEBUG oslo_vmware.exceptions [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 867.011651] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 867.012142] env[69648]: ERROR nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 867.012142] env[69648]: Faults: ['InvalidArgument'] [ 867.012142] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Traceback (most recent call last): [ 867.012142] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 867.012142] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] yield resources [ 867.012142] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 867.012142] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] self.driver.spawn(context, instance, image_meta, [ 867.012142] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 867.012142] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] self._vmops.spawn(context, instance, image_meta, injected_files, [ 867.012142] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 867.012142] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] self._fetch_image_if_missing(context, vi) [ 867.012142] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] image_cache(vi, tmp_image_ds_loc) [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] vm_util.copy_virtual_disk( [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] session._wait_for_task(vmdk_copy_task) [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] return self.wait_for_task(task_ref) [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] return evt.wait() [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] result = hub.switch() [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 867.012528] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] return self.greenlet.switch() [ 867.013179] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 867.013179] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] self.f(*self.args, **self.kw) [ 867.013179] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 867.013179] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] raise exceptions.translate_fault(task_info.error) [ 867.013179] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 867.013179] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Faults: ['InvalidArgument'] [ 867.013179] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] [ 867.013179] env[69648]: INFO nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Terminating instance [ 867.014097] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 867.014312] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 867.014550] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-bea28b9d-593a-48e1-bdee-fd044cc45819 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.016782] env[69648]: DEBUG nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 867.016959] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 867.017711] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca2896a1-9a51-4d90-9437-aee2a34f0a2b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.024373] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 867.024595] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-12b13535-6d4d-490f-88de-69704fb6eb03 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.026852] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 867.027033] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 867.028015] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-98326992-6b09-406d-b28f-c52fa82ee51e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.032514] env[69648]: DEBUG oslo_vmware.api [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Waiting for the task: (returnval){ [ 867.032514] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52e4de37-86f8-d11c-8827-dbd0e7e0d5a2" [ 867.032514] env[69648]: _type = "Task" [ 867.032514] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 867.039667] env[69648]: DEBUG oslo_vmware.api [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52e4de37-86f8-d11c-8827-dbd0e7e0d5a2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 867.087722] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 867.087957] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 867.089071] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Deleting the datastore file [datastore1] ce04d2df-8587-4cda-93b1-cad7ba3ff670 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 867.089071] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7c70d2cb-c61e-4f05-8e08-52539a939d4e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.096189] env[69648]: DEBUG oslo_vmware.api [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Waiting for the task: (returnval){ [ 867.096189] env[69648]: value = "task-3466520" [ 867.096189] env[69648]: _type = "Task" [ 867.096189] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 867.104379] env[69648]: DEBUG oslo_vmware.api [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Task: {'id': task-3466520, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 867.545027] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 867.545027] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Creating directory with path [datastore1] vmware_temp/54d43a57-8d7a-4f57-8637-cbd3d9b45239/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 867.545027] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-733a9d22-dd24-4758-a780-31d64cb88b4b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.555758] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Created directory with path [datastore1] vmware_temp/54d43a57-8d7a-4f57-8637-cbd3d9b45239/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 867.555962] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Fetch image to [datastore1] vmware_temp/54d43a57-8d7a-4f57-8637-cbd3d9b45239/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 867.556221] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/54d43a57-8d7a-4f57-8637-cbd3d9b45239/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 867.556976] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-007cfda5-7a0b-4382-be98-258a4a144631 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.564169] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4da1a0e-5039-45ba-8fe9-9a18d70468bf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.575071] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-692ea956-a3ac-4d08-b39f-147b4b3f6f30 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.609096] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-331b8707-14b1-4d45-bfd5-6362c3dbd5c4 {{(pid=69648) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.616086] env[69648]: DEBUG oslo_vmware.api [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Task: {'id': task-3466520, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065464} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 867.617502] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 867.617699] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 867.617873] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 867.618062] env[69648]: INFO nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Took 0.60 seconds to destroy the instance on the hypervisor. 
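The inventory the resource tracker reports earlier in this log (VCPU total=48, allocation_ratio=4.0; MEMORY_MB total=196590, reserved=512; DISK_GB total=400) determines how much capacity placement can actually allocate on this node: for each resource class the usable amount is (total - reserved) * allocation_ratio. The short sketch below is only an illustrative recomputation of those figures in plain Python; the inventory values are copied from the log entries above, while the function name and the dict layout are hypothetical and not taken from Nova's source.

    # Illustrative sketch (not Nova code): recompute allocatable capacity from
    # the inventory dict the resource tracker logged for provider
    # d38a352b-7808-44da-8216-792e96aadc88, using the standard placement rule
    # capacity = (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    def allocatable(inv):
        # Per resource class: how much placement will hand out in total.
        return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
                for rc, v in inv.items()}

    print(allocatable(inventory))
    # -> {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}

Against those totals, the ten active instances listed above (10 VCPU, 1856 MB RAM, 10 GB disk in the "Final resource view" entry) leave ample headroom, which is why the subsequent inventory refreshes report no change.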
[ 867.620096] env[69648]: DEBUG nova.compute.claims [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 867.620268] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 867.620481] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 867.622835] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3396ce80-8ffd-4f3c-b0bb-d43b74629c0d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.643753] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 867.696439] env[69648]: DEBUG oslo_vmware.rw_handles [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/54d43a57-8d7a-4f57-8637-cbd3d9b45239/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 867.757707] env[69648]: DEBUG oslo_vmware.rw_handles [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 867.757916] env[69648]: DEBUG oslo_vmware.rw_handles [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/54d43a57-8d7a-4f57-8637-cbd3d9b45239/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 868.048291] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57e30a48-a483-46d6-bfc9-7d5b41ed0547 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.055775] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0fc6c70-9d2f-42f0-9e49-7148be6052b5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.087341] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64baeba3-a31b-4a2b-a757-76ccdf15d622 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.095195] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fa13c20-e82d-4e0a-b651-a51a3249b4dc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.109293] env[69648]: DEBUG nova.compute.provider_tree [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 868.118406] env[69648]: DEBUG nova.scheduler.client.report [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 868.134051] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.513s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 868.134563] env[69648]: ERROR nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 868.134563] env[69648]: Faults: ['InvalidArgument'] [ 868.134563] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Traceback (most recent call last): [ 868.134563] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File 
"/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 868.134563] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] self.driver.spawn(context, instance, image_meta, [ 868.134563] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 868.134563] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] self._vmops.spawn(context, instance, image_meta, injected_files, [ 868.134563] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 868.134563] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] self._fetch_image_if_missing(context, vi) [ 868.134563] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 868.134563] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] image_cache(vi, tmp_image_ds_loc) [ 868.134563] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] vm_util.copy_virtual_disk( [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] session._wait_for_task(vmdk_copy_task) [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] return self.wait_for_task(task_ref) [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] return evt.wait() [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] result = hub.switch() [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] return self.greenlet.switch() [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 868.134934] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] self.f(*self.args, **self.kw) [ 868.135292] env[69648]: ERROR 
nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 868.135292] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] raise exceptions.translate_fault(task_info.error) [ 868.135292] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 868.135292] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Faults: ['InvalidArgument'] [ 868.135292] env[69648]: ERROR nova.compute.manager [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] [ 868.135292] env[69648]: DEBUG nova.compute.utils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 868.136920] env[69648]: DEBUG nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Build of instance ce04d2df-8587-4cda-93b1-cad7ba3ff670 was re-scheduled: A specified parameter was not correct: fileType [ 868.136920] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 868.137323] env[69648]: DEBUG nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 868.137499] env[69648]: DEBUG nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 868.137678] env[69648]: DEBUG nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 868.137838] env[69648]: DEBUG nova.network.neutron [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 868.752837] env[69648]: DEBUG nova.network.neutron [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 868.772545] env[69648]: INFO nova.compute.manager [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Took 0.63 seconds to deallocate network for instance. [ 868.887083] env[69648]: INFO nova.scheduler.client.report [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Deleted allocations for instance ce04d2df-8587-4cda-93b1-cad7ba3ff670 [ 868.914513] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d5379eb-f2d3-4c7a-8dad-e563044021f4 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 291.415s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 868.915566] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 94.169s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 868.915908] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Acquiring lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 868.916482] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 
tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 868.916674] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 868.918679] env[69648]: INFO nova.compute.manager [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Terminating instance [ 868.921077] env[69648]: DEBUG nova.compute.manager [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 868.921299] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 868.921563] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9f0c1943-6b21-43b7-997d-c6a49c2edc8f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.931166] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdf7e4ab-f91e-4dfb-91bb-dc562e6104c6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.942995] env[69648]: DEBUG nova.compute.manager [None req-72356b40-a78c-4c8a-b09e-7707263e8788 tempest-ServerGroupTestJSON-191439854 tempest-ServerGroupTestJSON-191439854-project-member] [instance: c411e626-dff7-4999-8e69-9716f322d518] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 868.966577] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ce04d2df-8587-4cda-93b1-cad7ba3ff670 could not be found. 
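The InvalidArgument/fileType failure recorded above surfaces from the task-polling side of the session rather than from the request itself: CopyVirtualDisk_Task returns a task object, and the fault is only raised on the client when the task result is polled, which is why the traceback ends in oslo_vmware's _poll_task. A minimal, self-contained sketch of that pattern (plain Python; TaskFault, poll_task_info and the TaskInfo-like dicts are illustrative stand-ins, not the oslo.vmware API):

    import time

    class TaskFault(Exception):
        # Illustrative stand-in for oslo_vmware.exceptions.VimFaultException.
        def __init__(self, message, fault_list):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(poll_task_info, interval=0.5):
        # Poll a vCenter-style task until it reaches a terminal state.
        # poll_task_info returns a dict shaped like a simplified TaskInfo:
        # {'state': 'running' | 'success' | 'error', 'error': {...}}.
        while True:
            info = poll_task_info()
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                err = info['error']
                # The server-side fault is translated into a client-side
                # exception here, so the traceback points at the polling
                # loop rather than at the original request.
                raise TaskFault(err['localizedMessage'], err['faults'])
            time.sleep(interval)

    # A task that fails the same way as the disk copy above.
    states = iter([
        {'state': 'running'},
        {'state': 'error',
         'error': {'localizedMessage':
                   'A specified parameter was not correct: fileType',
                   'faults': ['InvalidArgument']}},
    ])
    try:
        wait_for_task(lambda: next(states), interval=0)
    except TaskFault as exc:
        print(exc, exc.fault_list)  # -> message and ['InvalidArgument']

Once the exception propagates, the surrounding records show the expected cleanup sequence: the resource claim is aborted, networking is deallocated, the build is re-scheduled, and the later terminate tolerates the missing backend VM (InstanceNotFound) by treating it as already destroyed.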
[ 868.967752] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 868.967966] env[69648]: INFO nova.compute.manager [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Took 0.05 seconds to destroy the instance on the hypervisor. [ 868.968269] env[69648]: DEBUG oslo.service.loopingcall [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 868.971517] env[69648]: DEBUG nova.compute.manager [None req-72356b40-a78c-4c8a-b09e-7707263e8788 tempest-ServerGroupTestJSON-191439854 tempest-ServerGroupTestJSON-191439854-project-member] [instance: c411e626-dff7-4999-8e69-9716f322d518] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 868.971517] env[69648]: DEBUG nova.compute.manager [-] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 868.971517] env[69648]: DEBUG nova.network.neutron [-] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 868.999276] env[69648]: DEBUG oslo_concurrency.lockutils [None req-72356b40-a78c-4c8a-b09e-7707263e8788 tempest-ServerGroupTestJSON-191439854 tempest-ServerGroupTestJSON-191439854-project-member] Lock "c411e626-dff7-4999-8e69-9716f322d518" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.892s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.002956] env[69648]: DEBUG nova.network.neutron [-] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 869.009771] env[69648]: DEBUG nova.compute.manager [None req-b847d18d-9172-45f8-ab9a-4a9d0c89a846 tempest-ServerMetadataNegativeTestJSON-2070852957 tempest-ServerMetadataNegativeTestJSON-2070852957-project-member] [instance: 2c03e5ce-0ebd-40b3-982a-f0f7d4742dde] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 869.016019] env[69648]: INFO nova.compute.manager [-] [instance: ce04d2df-8587-4cda-93b1-cad7ba3ff670] Took 0.04 seconds to deallocate network for instance. [ 869.048481] env[69648]: DEBUG nova.compute.manager [None req-b847d18d-9172-45f8-ab9a-4a9d0c89a846 tempest-ServerMetadataNegativeTestJSON-2070852957 tempest-ServerMetadataNegativeTestJSON-2070852957-project-member] [instance: 2c03e5ce-0ebd-40b3-982a-f0f7d4742dde] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 869.072150] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b847d18d-9172-45f8-ab9a-4a9d0c89a846 tempest-ServerMetadataNegativeTestJSON-2070852957 tempest-ServerMetadataNegativeTestJSON-2070852957-project-member] Lock "2c03e5ce-0ebd-40b3-982a-f0f7d4742dde" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 234.720s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.082269] env[69648]: DEBUG nova.compute.manager [None req-7ce956e2-48f9-4060-b92a-2477aab38cb5 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] [instance: 72591ff3-6bb9-4b0a-9f38-fa6111f74408] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 869.105643] env[69648]: DEBUG nova.compute.manager [None req-7ce956e2-48f9-4060-b92a-2477aab38cb5 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] [instance: 72591ff3-6bb9-4b0a-9f38-fa6111f74408] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 869.120604] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1881e2f7-2114-459e-9f39-94a04e8911c2 tempest-FloatingIPsAssociationNegativeTestJSON-897893609 tempest-FloatingIPsAssociationNegativeTestJSON-897893609-project-member] Lock "ce04d2df-8587-4cda-93b1-cad7ba3ff670" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.205s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.131852] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7ce956e2-48f9-4060-b92a-2477aab38cb5 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] Lock "72591ff3-6bb9-4b0a-9f38-fa6111f74408" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.098s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.139322] env[69648]: DEBUG nova.compute.manager [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] [instance: 19e7f1b0-5cd9-453f-8600-a7d76487de87] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 869.160956] env[69648]: DEBUG nova.compute.manager [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] [instance: 19e7f1b0-5cd9-453f-8600-a7d76487de87] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 869.180748] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] Lock "19e7f1b0-5cd9-453f-8600-a7d76487de87" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.848s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.189513] env[69648]: DEBUG nova.compute.manager [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] [instance: 2dd9d8b9-8944-43cf-989b-e07354e29d40] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 869.210170] env[69648]: DEBUG nova.compute.manager [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] [instance: 2dd9d8b9-8944-43cf-989b-e07354e29d40] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 869.228666] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] Lock "2dd9d8b9-8944-43cf-989b-e07354e29d40" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.864s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.238539] env[69648]: DEBUG nova.compute.manager [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] [instance: dd0a5705-5745-439a-9fe2-23b852b86c2c] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 869.260202] env[69648]: DEBUG nova.compute.manager [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] [instance: dd0a5705-5745-439a-9fe2-23b852b86c2c] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 869.280362] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9aefe562-2701-46f2-a6ae-acd161a332d1 tempest-ListServersNegativeTestJSON-1221185087 tempest-ListServersNegativeTestJSON-1221185087-project-member] Lock "dd0a5705-5745-439a-9fe2-23b852b86c2c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.870s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.288689] env[69648]: DEBUG nova.compute.manager [None req-778af14a-f1b6-46b9-890c-f5cc47f441fd tempest-InstanceActionsV221TestJSON-1043921942 tempest-InstanceActionsV221TestJSON-1043921942-project-member] [instance: 12ddb227-b0e5-47cf-92b0-5c7338c1120e] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 869.312201] env[69648]: DEBUG nova.compute.manager [None req-778af14a-f1b6-46b9-890c-f5cc47f441fd tempest-InstanceActionsV221TestJSON-1043921942 tempest-InstanceActionsV221TestJSON-1043921942-project-member] [instance: 12ddb227-b0e5-47cf-92b0-5c7338c1120e] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 869.332218] env[69648]: DEBUG oslo_concurrency.lockutils [None req-778af14a-f1b6-46b9-890c-f5cc47f441fd tempest-InstanceActionsV221TestJSON-1043921942 tempest-InstanceActionsV221TestJSON-1043921942-project-member] Lock "12ddb227-b0e5-47cf-92b0-5c7338c1120e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.653s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.341244] env[69648]: DEBUG nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 869.389479] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 869.389740] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 869.391323] env[69648]: INFO nova.compute.claims [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 869.736268] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e92eea2b-5f22-46d8-9d5b-d757a0470181 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.743778] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f148be37-9e1d-419a-8037-2de88230a3fb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.781810] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-905ec183-3cc9-4d00-83f8-d3cbd608ffa3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.789674] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb3b6c84-fb3a-40d1-8ea9-d9958223b55c {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.806066] env[69648]: DEBUG nova.compute.provider_tree [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 869.815675] env[69648]: DEBUG nova.scheduler.client.report [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 869.829473] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.440s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.829954] env[69648]: DEBUG nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 869.866914] env[69648]: DEBUG nova.compute.utils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 869.869803] env[69648]: DEBUG nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 869.869803] env[69648]: DEBUG nova.network.neutron [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 869.882083] env[69648]: DEBUG nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Start building block device mappings for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 869.956088] env[69648]: DEBUG nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 869.988640] env[69648]: DEBUG nova.virt.hardware [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 869.988926] env[69648]: DEBUG nova.virt.hardware [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 869.989246] env[69648]: DEBUG nova.virt.hardware [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 869.989484] env[69648]: DEBUG nova.virt.hardware [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 869.989608] env[69648]: DEBUG nova.virt.hardware [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 869.989774] env[69648]: DEBUG nova.virt.hardware [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 869.989976] env[69648]: DEBUG nova.virt.hardware [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 869.990510] env[69648]: DEBUG nova.virt.hardware [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 869.990741] env[69648]: DEBUG nova.virt.hardware [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 869.990922] env[69648]: DEBUG nova.virt.hardware [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 869.991117] env[69648]: DEBUG nova.virt.hardware [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 869.993299] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7feef63-50ff-4eee-8c6a-bc6b845d7b9b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 870.001009] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ed9b4f1-ea48-420c-8fae-ca5508d66af1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 870.007412] env[69648]: DEBUG nova.policy [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9fedfd945c03449d9a5d450454bf9039', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4a2656c7dd004cbb9418c9fe7e1f144d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 870.779977] env[69648]: DEBUG nova.network.neutron [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Successfully created port: a2948875-9ca8-4e4e-8724-545119122a58 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 871.509159] env[69648]: DEBUG oslo_concurrency.lockutils [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquiring lock "62954fe5-a462-40bd-85ec-d03b98d2ec42" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
871.985301] env[69648]: DEBUG nova.network.neutron [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Successfully updated port: a2948875-9ca8-4e4e-8724-545119122a58 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 872.007226] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquiring lock "refresh_cache-62954fe5-a462-40bd-85ec-d03b98d2ec42" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 872.007376] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquired lock "refresh_cache-62954fe5-a462-40bd-85ec-d03b98d2ec42" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 872.007539] env[69648]: DEBUG nova.network.neutron [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 872.094937] env[69648]: DEBUG nova.network.neutron [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 872.173189] env[69648]: DEBUG nova.compute.manager [req-f3bf893b-cbee-44d8-ac88-1cd0f099bed2 req-b8699bfd-ae35-43ab-9148-f6c53b6c09c5 service nova] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Received event network-vif-plugged-a2948875-9ca8-4e4e-8724-545119122a58 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 872.173189] env[69648]: DEBUG oslo_concurrency.lockutils [req-f3bf893b-cbee-44d8-ac88-1cd0f099bed2 req-b8699bfd-ae35-43ab-9148-f6c53b6c09c5 service nova] Acquiring lock "62954fe5-a462-40bd-85ec-d03b98d2ec42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 872.173189] env[69648]: DEBUG oslo_concurrency.lockutils [req-f3bf893b-cbee-44d8-ac88-1cd0f099bed2 req-b8699bfd-ae35-43ab-9148-f6c53b6c09c5 service nova] Lock "62954fe5-a462-40bd-85ec-d03b98d2ec42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 872.173189] env[69648]: DEBUG oslo_concurrency.lockutils [req-f3bf893b-cbee-44d8-ac88-1cd0f099bed2 req-b8699bfd-ae35-43ab-9148-f6c53b6c09c5 service nova] Lock "62954fe5-a462-40bd-85ec-d03b98d2ec42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 872.173341] env[69648]: DEBUG nova.compute.manager [req-f3bf893b-cbee-44d8-ac88-1cd0f099bed2 req-b8699bfd-ae35-43ab-9148-f6c53b6c09c5 service nova] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] No waiting events found dispatching network-vif-plugged-a2948875-9ca8-4e4e-8724-545119122a58 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 872.173341] env[69648]: WARNING nova.compute.manager [req-f3bf893b-cbee-44d8-ac88-1cd0f099bed2 req-b8699bfd-ae35-43ab-9148-f6c53b6c09c5 service nova] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Received unexpected event network-vif-plugged-a2948875-9ca8-4e4e-8724-545119122a58 for instance with vm_state building and task_state deleting. 
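The "No waiting events found dispatching" / "Received unexpected event" pair above comes from the per-instance external-event table that InstanceEvents maintains: when Neutron reports network-vif-plugged for port a2948875-9ca8-4e4e-8724-545119122a58, the compute manager looks for a registered waiter for that (instance, event) pair, and since none exists here (the instance is already in task_state deleting) the event is only logged. A rough sketch of that registry pattern (plain Python with threading; ExternalEventRegistry and its methods are illustrative, not Nova's actual InstanceEvents implementation):

    import threading
    from collections import defaultdict

    class ExternalEventRegistry:
        # Simplified per-instance table of expected external events.
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = defaultdict(dict)  # instance_uuid -> {event_name: Event}

        def prepare(self, instance_uuid, event_name):
            # Registered by the code path that expects the event, e.g. a
            # build waiting for network-vif-plugged-<port-id>.
            waiter = threading.Event()
            with self._lock:
                self._waiters[instance_uuid][event_name] = waiter
            return waiter

        def dispatch(self, instance_uuid, event_name):
            # Called when an external event arrives from the network service.
            with self._lock:
                waiter = self._waiters.get(instance_uuid, {}).pop(event_name, None)
            if waiter is None:
                # Nothing is waiting for this event; it is only reported,
                # matching the "Received unexpected event" warning above.
                print('unexpected event %s for %s' % (event_name, instance_uuid))
                return False
            waiter.set()
            return True

    registry = ExternalEventRegistry()
    uuid = '62954fe5-a462-40bd-85ec-d03b98d2ec42'
    event = 'network-vif-plugged-a2948875-9ca8-4e4e-8724-545119122a58'
    # With no waiter registered, the event is reported as unexpected.
    registry.dispatch(uuid, event)
    # With a waiter prepared first, the same event unblocks it instead.
    waiter = registry.prepare(uuid, event)
    registry.dispatch(uuid, event)
    assert waiter.is_set()

In the real service a build can block on such a waiter until its VIF is reported as plugged; here the build raced with a delete, so the plug notification arrived with nothing left waiting for it.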
[ 872.318749] env[69648]: DEBUG nova.network.neutron [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Updating instance_info_cache with network_info: [{"id": "a2948875-9ca8-4e4e-8724-545119122a58", "address": "fa:16:3e:06:88:66", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.211", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2948875-9c", "ovs_interfaceid": "a2948875-9ca8-4e4e-8724-545119122a58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 872.337457] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Releasing lock "refresh_cache-62954fe5-a462-40bd-85ec-d03b98d2ec42" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 872.337829] env[69648]: DEBUG nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Instance network_info: |[{"id": "a2948875-9ca8-4e4e-8724-545119122a58", "address": "fa:16:3e:06:88:66", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.211", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2948875-9c", "ovs_interfaceid": "a2948875-9ca8-4e4e-8724-545119122a58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 872.338299] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 
tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:06:88:66', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92233552-2c0c-416e-9bf3-bfcca8eda2dc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a2948875-9ca8-4e4e-8724-545119122a58', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 872.346703] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Creating folder: Project (4a2656c7dd004cbb9418c9fe7e1f144d). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 872.347354] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ee622a95-3cd7-41ce-a25b-0cbef10e2d5a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.360523] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Created folder: Project (4a2656c7dd004cbb9418c9fe7e1f144d) in parent group-v692308. [ 872.360721] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Creating folder: Instances. Parent ref: group-v692352. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 872.360958] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c945e5ce-6cc5-4c6d-8020-8ddd4e830b1f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.369663] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Created folder: Instances in parent group-v692352. [ 872.369909] env[69648]: DEBUG oslo.service.loopingcall [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 872.370120] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 872.370331] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-244be745-eff7-4f01-b75b-ed1400489bf6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.391957] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 872.391957] env[69648]: value = "task-3466523" [ 872.391957] env[69648]: _type = "Task" [ 872.391957] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 872.400973] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466523, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 872.907655] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466523, 'name': CreateVM_Task, 'duration_secs': 0.277009} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 872.907655] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 872.907655] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 872.907655] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 872.907655] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 872.908212] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-72cb2bd5-36a0-40e0-bc51-d1e6b0335ce1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.912024] env[69648]: DEBUG oslo_vmware.api [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Waiting for the task: (returnval){ [ 872.912024] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]529ccb38-3a28-c5fb-75a2-509de9a6ad55" [ 872.912024] env[69648]: _type = "Task" [ 872.912024] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 872.919224] env[69648]: DEBUG oslo_vmware.api [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]529ccb38-3a28-c5fb-75a2-509de9a6ad55, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 873.420330] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 873.420595] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 873.420812] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 874.473988] env[69648]: DEBUG nova.compute.manager [req-1f9863cc-552d-4dc0-919e-1b6379778acd req-eeb4495a-a2e9-4a0e-a027-e22f21377e5b service nova] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Received event network-changed-a2948875-9ca8-4e4e-8724-545119122a58 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 874.474218] env[69648]: DEBUG nova.compute.manager [req-1f9863cc-552d-4dc0-919e-1b6379778acd req-eeb4495a-a2e9-4a0e-a027-e22f21377e5b service nova] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Refreshing instance network info cache due to event network-changed-a2948875-9ca8-4e4e-8724-545119122a58. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 874.474372] env[69648]: DEBUG oslo_concurrency.lockutils [req-1f9863cc-552d-4dc0-919e-1b6379778acd req-eeb4495a-a2e9-4a0e-a027-e22f21377e5b service nova] Acquiring lock "refresh_cache-62954fe5-a462-40bd-85ec-d03b98d2ec42" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 874.474463] env[69648]: DEBUG oslo_concurrency.lockutils [req-1f9863cc-552d-4dc0-919e-1b6379778acd req-eeb4495a-a2e9-4a0e-a027-e22f21377e5b service nova] Acquired lock "refresh_cache-62954fe5-a462-40bd-85ec-d03b98d2ec42" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 874.474620] env[69648]: DEBUG nova.network.neutron [req-1f9863cc-552d-4dc0-919e-1b6379778acd req-eeb4495a-a2e9-4a0e-a027-e22f21377e5b service nova] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Refreshing network info cache for port a2948875-9ca8-4e4e-8724-545119122a58 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 875.023332] env[69648]: DEBUG nova.network.neutron [req-1f9863cc-552d-4dc0-919e-1b6379778acd req-eeb4495a-a2e9-4a0e-a027-e22f21377e5b service nova] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Updated VIF entry in instance network info cache for port a2948875-9ca8-4e4e-8724-545119122a58. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 875.023725] env[69648]: DEBUG nova.network.neutron [req-1f9863cc-552d-4dc0-919e-1b6379778acd req-eeb4495a-a2e9-4a0e-a027-e22f21377e5b service nova] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Updating instance_info_cache with network_info: [{"id": "a2948875-9ca8-4e4e-8724-545119122a58", "address": "fa:16:3e:06:88:66", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.211", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2948875-9c", "ovs_interfaceid": "a2948875-9ca8-4e4e-8724-545119122a58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 875.034517] env[69648]: DEBUG oslo_concurrency.lockutils [req-1f9863cc-552d-4dc0-919e-1b6379778acd req-eeb4495a-a2e9-4a0e-a027-e22f21377e5b service nova] Releasing lock "refresh_cache-62954fe5-a462-40bd-85ec-d03b98d2ec42" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 886.233553] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 886.260987] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 886.261202] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 886.261385] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 887.065335] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 887.065560] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances 
{{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 888.065556] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 888.065796] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 888.065872] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 888.096350] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 888.096537] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 888.096800] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 888.096800] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 888.096916] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 888.097056] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 888.097191] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 888.097318] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 888.097441] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 888.097562] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 888.097686] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 888.098301] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 888.098453] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 888.098620] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 888.114930] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 888.115423] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 888.115622] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 888.115786] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 888.117184] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da0dfa18-febd-4eb5-a15d-ce100edf06b7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.129499] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b42b9084-a0d2-4887-a8d0-5084b610fffb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.145219] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9172bba1-2c9c-4493-91e2-ba91aca8dda2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.152462] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d821947d-79bd-4330-b47c-17085eebcc0c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.186195] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180935MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 888.186814] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 888.186814] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 888.291498] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance dfbb396b-8f18-456d-9064-be451cdd1ac9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 888.291695] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 20bce654-7f57-4de6-8f7a-c1b34286fc86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 888.291827] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 928bc799-4fed-4005-89d2-e18196f88ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 888.291955] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ed63f202-c76d-4492-b738-606ee1c6b059 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 888.292095] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 1756fcf7-3d68-4d02-9a66-619d0a1a9505 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 888.292223] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 642ba6f1-b912-4f55-9199-9c98b58ffc1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 888.292345] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 91fcee48-3466-480d-bf87-bc4de17fbf31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 888.292466] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 888.292597] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 888.293032] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 888.304697] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.320218] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60b00251-25fc-483d-88fe-a84165d6a435 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.333155] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 401783f7-a434-4c01-8f9a-e3f5fecd10da has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.346234] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6062dd02-230d-42bc-8304-fc122f1f1489 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.358416] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc6cfa72-0132-4bf2-9054-b1064d3e4efb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.374937] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a8a08a83-45f8-43d1-b405-52c751bc2e0a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.391779] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0fe09233-c2e0-4d7b-b8df-689df7fdbced has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.405042] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 933d6ead-8da8-43cd-9f02-9373bad0348d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.416344] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3968f404-57ba-4088-b516-eb9c085f6b75 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.427464] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8b1d1227-c6ec-4f6a-8076-b8e4b4efa12a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.438581] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 7532d0c5-20f4-4b64-85f1-e7b16d15acf8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.456076] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6f31fc53-7a85-4db2-977a-a02f174c1eca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.468168] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c6473d99-b222-4b4b-8d2d-61876e54dc43 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.480033] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance da62948a-a57e-4a0a-9fad-fc7de9f5f878 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.491200] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance b3a99599-9514-4702-b01e-95ccb064ed4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.503408] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 7ca86d88-f679-40bd-a46d-20c39fe13247 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.514375] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0fa3ac73-db70-4034-91da-29e42cefc471 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.526171] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 4f321d17-20ad-49d2-9952-c27e1161f339 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 888.526171] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 888.526416] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 888.891976] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44ea3f99-9094-4914-9bc0-49c8431ead5d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.899765] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5ad1d75-5c4f-4aac-b379-4001b5c0adb3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.932302] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-867fbd0e-232e-4302-9217-4cf9c3120743 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.939947] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3212eeff-c0e4-4632-99c3-e2f219d0f7c9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.955264] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 888.968206] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 
65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 888.983434] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 888.983638] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.797s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 890.950909] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 895.335651] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquiring lock "e64fd474-91ab-449e-8785-e788685ed77a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 895.335952] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Lock "e64fd474-91ab-449e-8785-e788685ed77a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 914.394993] env[69648]: WARNING oslo_vmware.rw_handles [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 914.394993] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 914.394993] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 914.394993] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 914.394993] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 914.394993] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 914.394993] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 914.394993] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 914.394993] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 914.394993] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 914.394993] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: 
Remote end closed connection without response [ 914.394993] env[69648]: ERROR oslo_vmware.rw_handles [ 914.395606] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/54d43a57-8d7a-4f57-8637-cbd3d9b45239/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 914.397367] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 914.397622] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Copying Virtual Disk [datastore1] vmware_temp/54d43a57-8d7a-4f57-8637-cbd3d9b45239/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/54d43a57-8d7a-4f57-8637-cbd3d9b45239/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 914.397933] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d556bacb-16dc-4651-abb9-bce7ad1cf5c6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.406554] env[69648]: DEBUG oslo_vmware.api [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Waiting for the task: (returnval){ [ 914.406554] env[69648]: value = "task-3466524" [ 914.406554] env[69648]: _type = "Task" [ 914.406554] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 914.414516] env[69648]: DEBUG oslo_vmware.api [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Task: {'id': task-3466524, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 914.917490] env[69648]: DEBUG oslo_vmware.exceptions [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 914.917777] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 914.918646] env[69648]: ERROR nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 914.918646] env[69648]: Faults: ['InvalidArgument'] [ 914.918646] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Traceback (most recent call last): [ 914.918646] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 914.918646] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] yield resources [ 914.918646] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 914.918646] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] self.driver.spawn(context, instance, image_meta, [ 914.918646] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 914.918646] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 914.918646] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 914.918646] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] self._fetch_image_if_missing(context, vi) [ 914.918646] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] image_cache(vi, tmp_image_ds_loc) [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] vm_util.copy_virtual_disk( [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] session._wait_for_task(vmdk_copy_task) [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] return self.wait_for_task(task_ref) [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] return evt.wait() [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] result = hub.switch() [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 914.919146] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] return self.greenlet.switch() [ 914.919643] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 914.919643] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] self.f(*self.args, **self.kw) [ 914.919643] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 914.919643] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] raise exceptions.translate_fault(task_info.error) [ 914.919643] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 914.919643] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Faults: ['InvalidArgument'] [ 914.919643] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] [ 914.919643] env[69648]: INFO nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Terminating instance [ 914.920650] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 914.920866] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 914.921120] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-86458c7b-c799-4ef0-80da-1a2570af57f5 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.923465] env[69648]: DEBUG nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 914.923662] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 914.924408] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64955143-a074-49d7-97fe-aa366d1d2255 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.931323] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 914.931574] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ec47f988-fa38-43b5-b329-bda8fb78a7cb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.933982] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 914.934191] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 914.935181] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1524a848-e135-4982-afa6-ee3f2b844c87 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.940091] env[69648]: DEBUG oslo_vmware.api [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Waiting for the task: (returnval){ [ 914.940091] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52b1e1c8-dfe3-5110-982b-de3f864dbc59" [ 914.940091] env[69648]: _type = "Task" [ 914.940091] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 914.947064] env[69648]: DEBUG oslo_vmware.api [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52b1e1c8-dfe3-5110-982b-de3f864dbc59, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 914.993805] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 914.994054] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 914.994227] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Deleting the datastore file [datastore1] dfbb396b-8f18-456d-9064-be451cdd1ac9 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 914.994485] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-10217076-0742-4070-8df6-efef2998018c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.000802] env[69648]: DEBUG oslo_vmware.api [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Waiting for the task: (returnval){ [ 915.000802] env[69648]: value = "task-3466526" [ 915.000802] env[69648]: _type = "Task" [ 915.000802] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 915.009194] env[69648]: DEBUG oslo_vmware.api [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Task: {'id': task-3466526, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 915.452951] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 915.455022] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Creating directory with path [datastore1] vmware_temp/b3770000-3f68-4c71-a7a1-ae17a0760f4f/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 915.455022] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3b8e2b4d-8588-4d87-ab08-748782763807 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.465172] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Created directory with path [datastore1] vmware_temp/b3770000-3f68-4c71-a7a1-ae17a0760f4f/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 915.465275] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Fetch image to [datastore1] vmware_temp/b3770000-3f68-4c71-a7a1-ae17a0760f4f/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 915.465982] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/b3770000-3f68-4c71-a7a1-ae17a0760f4f/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 915.466159] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-902a514c-4505-4ab4-a38a-5b48e3ae7cde {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.473047] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9d1b214-e63a-436e-ad40-5fc72f029266 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.481985] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0415037-c28e-4485-a0b0-4a778f57f5a2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.517204] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23629a93-da76-41d0-b120-f8f2e80efe25 {{(pid=69648) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.524447] env[69648]: DEBUG oslo_vmware.api [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Task: {'id': task-3466526, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082163} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 915.525938] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 915.526198] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 915.526413] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 915.526620] env[69648]: INFO nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Took 0.60 seconds to destroy the instance on the hypervisor. 
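The entries above show the oslo.vmware task pattern end to end: the driver submits CopyVirtualDisk_Task (and later DeleteDatastoreFile_Task), the session waits for the task while polling its progress ("progress is 0%" ... "completed successfully"), and on failure _poll_task raises a translated fault, here VimFaultException: A specified parameter was not correct: fileType, Faults: ['InvalidArgument']. A minimal stand-alone sketch of that polling loop follows; it does not use oslo.vmware itself, and wait_for_task, TaskInfo, TaskFault and the fake collector are simplified stand-ins, not the oslo.vmware.api or Nova implementations.

    import time

    class TaskFault(Exception):
        """Task finished in an error state (stand-in for a translated VimFaultException)."""
        def __init__(self, message, faults):
            super().__init__(message)
            self.faults = faults

    class TaskInfo:
        """Snapshot of a remote task, as a property query might return it."""
        def __init__(self, state, progress=0, error=None, faults=()):
            self.state = state          # 'running', 'success' or 'error'
            self.progress = progress    # percent complete
            self.error = error          # server-side error message, if any
            self.faults = faults        # e.g. ['InvalidArgument']

    def wait_for_task(collect_task_info, task_id, poll_interval=0.5):
        """Poll a task until it finishes, mirroring the 'Waiting for the task ...
        to complete' / 'progress is N%' sequence in the log above."""
        while True:
            info = collect_task_info(task_id)
            if info.state == "success":
                return info
            if info.state == "error":
                # The real code translates the server fault class; here we just wrap it.
                raise TaskFault(info.error, info.faults)
            print(f"Task {task_id} progress is {info.progress}%.")
            time.sleep(poll_interval)

    # Usage with a fake collector that fails the way task-3466524 did:
    _states = iter([
        TaskInfo("running", progress=0),
        TaskInfo("error", error="A specified parameter was not correct: fileType",
                 faults=["InvalidArgument"]),
    ])
    try:
        wait_for_task(lambda task_id: next(_states), "task-3466524")
    except TaskFault as exc:
        print(f"Task failed: {exc} (faults: {exc.faults})")

When the copy task fails this way, the compute manager aborts the resource claim, destroys the half-built VM (the UnregisterVM and DeleteDatastoreFile tasks above) and re-schedules the build, which is exactly what the subsequent entries record.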
[ 915.528838] env[69648]: DEBUG nova.compute.claims [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 915.529059] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 915.529409] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 915.532020] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-93f741a8-2ca9-4b7d-98ef-1b39f483172a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.551805] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 915.607978] env[69648]: DEBUG oslo_vmware.rw_handles [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b3770000-3f68-4c71-a7a1-ae17a0760f4f/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 915.668212] env[69648]: DEBUG oslo_vmware.rw_handles [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 915.668283] env[69648]: DEBUG oslo_vmware.rw_handles [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b3770000-3f68-4c71-a7a1-ae17a0760f4f/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 915.970504] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-157a60a8-ccf6-4213-8b2a-0aa40f000ace {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.978044] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-daa44750-10ba-4398-a153-6218f310cd0a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.008860] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42785f44-5aa9-4807-a3db-069423d0ed7e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.016025] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfd6d49a-d582-49b5-a2a7-12a7b0993654 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.028569] env[69648]: DEBUG nova.compute.provider_tree [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 916.038346] env[69648]: DEBUG nova.scheduler.client.report [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 916.052293] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.523s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 916.052830] env[69648]: ERROR nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 916.052830] env[69648]: Faults: ['InvalidArgument'] [ 916.052830] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Traceback (most recent call last): [ 916.052830] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 916.052830] env[69648]: ERROR 
nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] self.driver.spawn(context, instance, image_meta, [ 916.052830] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 916.052830] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 916.052830] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 916.052830] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] self._fetch_image_if_missing(context, vi) [ 916.052830] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 916.052830] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] image_cache(vi, tmp_image_ds_loc) [ 916.052830] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] vm_util.copy_virtual_disk( [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] session._wait_for_task(vmdk_copy_task) [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] return self.wait_for_task(task_ref) [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] return evt.wait() [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] result = hub.switch() [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] return self.greenlet.switch() [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 916.053221] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] self.f(*self.args, **self.kw) [ 916.053796] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 916.053796] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] raise exceptions.translate_fault(task_info.error) [ 916.053796] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 916.053796] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Faults: ['InvalidArgument'] [ 916.053796] env[69648]: ERROR nova.compute.manager [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] [ 916.053796] env[69648]: DEBUG nova.compute.utils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 916.054957] env[69648]: DEBUG nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Build of instance dfbb396b-8f18-456d-9064-be451cdd1ac9 was re-scheduled: A specified parameter was not correct: fileType [ 916.054957] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 916.055354] env[69648]: DEBUG nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 916.055522] env[69648]: DEBUG nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 916.055693] env[69648]: DEBUG nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 916.055858] env[69648]: DEBUG nova.network.neutron [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 916.471667] env[69648]: DEBUG nova.network.neutron [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 916.489026] env[69648]: INFO nova.compute.manager [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Took 0.43 seconds to deallocate network for instance. [ 916.617572] env[69648]: INFO nova.scheduler.client.report [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Deleted allocations for instance dfbb396b-8f18-456d-9064-be451cdd1ac9 [ 916.640022] env[69648]: DEBUG oslo_concurrency.lockutils [None req-31a0f098-8b59-4734-9a3d-83629ba74d06 tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Lock "dfbb396b-8f18-456d-9064-be451cdd1ac9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 329.891s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 916.640774] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Lock "dfbb396b-8f18-456d-9064-be451cdd1ac9" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 129.416s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 916.640989] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Acquiring lock "dfbb396b-8f18-456d-9064-be451cdd1ac9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 916.641206] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Lock "dfbb396b-8f18-456d-9064-be451cdd1ac9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 916.641384] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Lock "dfbb396b-8f18-456d-9064-be451cdd1ac9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 916.643613] env[69648]: INFO nova.compute.manager [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Terminating instance [ 916.648378] env[69648]: DEBUG nova.compute.manager [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 916.648378] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 916.648378] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fffeda2a-bd68-4585-937a-35ae0af98675 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.657114] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2f8e1fc-0c23-4b45-8e18-23036f5c27c0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.667653] env[69648]: DEBUG nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 916.689263] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dfbb396b-8f18-456d-9064-be451cdd1ac9 could not be found. [ 916.689889] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 916.689889] env[69648]: INFO nova.compute.manager [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Took 0.04 seconds to destroy the instance on the hypervisor. 
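For readers tracing the failure above: the traceback ends with nova's vm_util.copy_virtual_disk handing a CopyVirtualDisk_Task to the oslo.vmware session, whose wait_for_task poll re-raises the vCenter task error as VimFaultException ("A specified parameter was not correct: fileType", faults ['InvalidArgument']); that is what triggers the re-schedule and the later terminate of an instance that never existed on the backend. The snippet below is only a minimal sketch of that invoke-and-wait pattern using oslo.vmware directly; the vCenter address, credentials, datastore paths and retry/poll settings are placeholders, not values from this deployment, and this is not Nova's actual code.

from oslo_vmware import api as vmware_api
from oslo_vmware import exceptions as vmware_exc

# host, username, password, API retry count, task poll interval (placeholders)
session = vmware_api.VMwareAPISession('vc.example.test', 'user', 'secret', 10, 0.5)

vdm = session.vim.service_content.virtualDiskManager
task = session.invoke_api(
    session.vim, 'CopyVirtualDisk_Task', vdm,
    # Against vCenter a real call also passes the datacenter moref(s) for
    # datastore-path names; omitted here to keep the sketch short.
    sourceName='[datastore1] devstack-image-cache_base/src.vmdk',
    destName='[datastore1] devstack-image-cache_base/dst.vmdk')

try:
    # wait_for_task polls the task in a looping call (the oslo_vmware
    # loopingcall frames in the traceback) and, if the task failed, translates
    # the server-side fault into an exception such as VimFaultException.
    session.wait_for_task(task)
except vmware_exc.VimFaultException as err:
    # For the failure in this log: "A specified parameter was not correct:
    # fileType", fault list ['InvalidArgument'].
    print(err)

Nova's own wrapper goes through the same path via session._wait_for_task, as the frames in the traceback show.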
[ 916.689889] env[69648]: DEBUG oslo.service.loopingcall [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 916.690081] env[69648]: DEBUG nova.compute.manager [-] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 916.690166] env[69648]: DEBUG nova.network.neutron [-] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 916.716754] env[69648]: DEBUG nova.network.neutron [-] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 916.719092] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 916.719358] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 916.720873] env[69648]: INFO nova.compute.claims [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 916.726123] env[69648]: INFO nova.compute.manager [-] [instance: dfbb396b-8f18-456d-9064-be451cdd1ac9] Took 0.04 seconds to deallocate network for instance. 
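A side note for reading the rest of this section: the repeated 'Acquiring lock ... by ...', 'Lock ... acquired ... waited N.NNNs' and 'Lock ... "released" ... held N.NNNs' lines all come from oslo.concurrency's named locks, which Nova places around the per-instance build and terminate paths, the resource tracker's "compute_resources" section and the "refresh_cache-<uuid>" network-info cache. The sketch below shows the two forms that produce these messages; the lock names are taken from this log, while the decorated function and its body are illustrative stand-ins rather than Nova code.

from oslo_concurrency import lockutils

# Decorator form: produces the 'Acquiring lock ... by "..."', 'Lock "..."
# acquired by "..." :: waited N.NNNs' and '... "released" ... held N.NNNs'
# lines (logged from the "inner" wrapper seen in the log).
@lockutils.synchronized('compute_resources')
def instance_claim(instance_uuid):
    # Stand-in body; only one caller per worker runs this at a time.
    print('claiming resources for', instance_uuid)

# Context-manager form: produces the 'Acquiring lock' / 'Acquired lock' /
# 'Releasing lock' triples around the refresh_cache-<uuid> sections.
with lockutils.lock('refresh_cache-8e6a4fd6-5f80-476d-9789-adea1be2ae72'):
    instance_claim('8e6a4fd6-5f80-476d-9789-adea1be2ae72')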
[ 916.827185] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41074fbe-5810-4e3b-b63d-74038bfd8dda tempest-ServersTestManualDisk-1983256646 tempest-ServersTestManualDisk-1983256646-project-member] Lock "dfbb396b-8f18-456d-9064-be451cdd1ac9" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.186s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 917.082347] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1906af28-a090-47f2-82b2-22c9e3ceacf4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.090188] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48620bbd-5712-405c-ad27-ecba5d5f5e40 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.120028] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfb40391-ab23-488e-884a-376a56fb6686 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.127669] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bef2be5-092a-4b38-93d3-e15808729227 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.142185] env[69648]: DEBUG nova.compute.provider_tree [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 917.153440] env[69648]: DEBUG nova.scheduler.client.report [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 917.168665] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.449s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 917.169082] env[69648]: DEBUG nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Start building networks asynchronously for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 917.203938] env[69648]: DEBUG nova.compute.utils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 917.205516] env[69648]: DEBUG nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 917.205690] env[69648]: DEBUG nova.network.neutron [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 917.215083] env[69648]: DEBUG nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 917.263472] env[69648]: DEBUG nova.policy [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4fa490cb4b014135a1912f927918ba6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd15936ad51c147498487afaf1a48e20d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 917.281171] env[69648]: DEBUG nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 917.308240] env[69648]: DEBUG nova.virt.hardware [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 917.308498] env[69648]: DEBUG nova.virt.hardware [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 917.308657] env[69648]: DEBUG nova.virt.hardware [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 917.308861] env[69648]: DEBUG nova.virt.hardware [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 917.309019] env[69648]: DEBUG nova.virt.hardware [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 917.309253] env[69648]: DEBUG nova.virt.hardware [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 917.309462] env[69648]: DEBUG nova.virt.hardware [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 917.309626] env[69648]: DEBUG nova.virt.hardware [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 917.309794] env[69648]: DEBUG nova.virt.hardware [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 917.309957] env[69648]: DEBUG nova.virt.hardware [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 917.310161] env[69648]: DEBUG nova.virt.hardware [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 917.311030] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96f892f0-f7b2-4826-a984-a4542425d4ac {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.320134] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed67262d-c1ba-4439-9a7b-a3e5862c0a42 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 918.080671] env[69648]: DEBUG nova.network.neutron [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Successfully created port: 33dcfae2-7348-4e32-87f8-d0b98700c881 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 918.958793] env[69648]: DEBUG nova.compute.manager [req-f35529d7-d95d-4c24-9f11-d0bae3994f30 req-3e9547dc-4b2f-4c78-a046-7ee2bac7c9ed service nova] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Received event network-vif-plugged-33dcfae2-7348-4e32-87f8-d0b98700c881 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 918.958793] env[69648]: DEBUG oslo_concurrency.lockutils [req-f35529d7-d95d-4c24-9f11-d0bae3994f30 req-3e9547dc-4b2f-4c78-a046-7ee2bac7c9ed service nova] Acquiring lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 918.958793] env[69648]: DEBUG oslo_concurrency.lockutils [req-f35529d7-d95d-4c24-9f11-d0bae3994f30 req-3e9547dc-4b2f-4c78-a046-7ee2bac7c9ed service nova] Lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 918.958793] env[69648]: DEBUG oslo_concurrency.lockutils [req-f35529d7-d95d-4c24-9f11-d0bae3994f30 req-3e9547dc-4b2f-4c78-a046-7ee2bac7c9ed service nova] Lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: 
held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 918.959197] env[69648]: DEBUG nova.compute.manager [req-f35529d7-d95d-4c24-9f11-d0bae3994f30 req-3e9547dc-4b2f-4c78-a046-7ee2bac7c9ed service nova] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] No waiting events found dispatching network-vif-plugged-33dcfae2-7348-4e32-87f8-d0b98700c881 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 918.959197] env[69648]: WARNING nova.compute.manager [req-f35529d7-d95d-4c24-9f11-d0bae3994f30 req-3e9547dc-4b2f-4c78-a046-7ee2bac7c9ed service nova] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Received unexpected event network-vif-plugged-33dcfae2-7348-4e32-87f8-d0b98700c881 for instance with vm_state building and task_state spawning. [ 919.039511] env[69648]: DEBUG nova.network.neutron [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Successfully updated port: 33dcfae2-7348-4e32-87f8-d0b98700c881 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 919.058968] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Acquiring lock "refresh_cache-8e6a4fd6-5f80-476d-9789-adea1be2ae72" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 919.058968] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Acquired lock "refresh_cache-8e6a4fd6-5f80-476d-9789-adea1be2ae72" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 919.058968] env[69648]: DEBUG nova.network.neutron [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 919.100238] env[69648]: DEBUG nova.network.neutron [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 919.360349] env[69648]: DEBUG nova.network.neutron [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Updating instance_info_cache with network_info: [{"id": "33dcfae2-7348-4e32-87f8-d0b98700c881", "address": "fa:16:3e:08:79:ce", "network": {"id": "8ea7f3e7-ea70-400f-8cf2-6220fb85d97b", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-303029862-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d15936ad51c147498487afaf1a48e20d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ab9e5e6-9bf8-4a8d-91c8-d22148e3d2ee", "external-id": "nsx-vlan-transportzone-401", "segmentation_id": 401, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap33dcfae2-73", "ovs_interfaceid": "33dcfae2-7348-4e32-87f8-d0b98700c881", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 919.375475] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Releasing lock "refresh_cache-8e6a4fd6-5f80-476d-9789-adea1be2ae72" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 919.375803] env[69648]: DEBUG nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Instance network_info: |[{"id": "33dcfae2-7348-4e32-87f8-d0b98700c881", "address": "fa:16:3e:08:79:ce", "network": {"id": "8ea7f3e7-ea70-400f-8cf2-6220fb85d97b", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-303029862-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d15936ad51c147498487afaf1a48e20d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ab9e5e6-9bf8-4a8d-91c8-d22148e3d2ee", "external-id": "nsx-vlan-transportzone-401", "segmentation_id": 401, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap33dcfae2-73", "ovs_interfaceid": "33dcfae2-7348-4e32-87f8-d0b98700c881", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 919.376277] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:08:79:ce', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8ab9e5e6-9bf8-4a8d-91c8-d22148e3d2ee', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '33dcfae2-7348-4e32-87f8-d0b98700c881', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 919.390908] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Creating folder: Project (d15936ad51c147498487afaf1a48e20d). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 919.391588] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e6a6647a-6b01-483d-8c00-3620ac939f2f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 919.409025] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Created folder: Project (d15936ad51c147498487afaf1a48e20d) in parent group-v692308. [ 919.409025] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Creating folder: Instances. Parent ref: group-v692355. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 919.409025] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-be618cbf-e7cb-4edf-be97-7e655090f0d3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 919.417250] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Created folder: Instances in parent group-v692355. [ 919.417718] env[69648]: DEBUG oslo.service.loopingcall [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 919.419518] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 919.419518] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-46a1ae15-a883-4c0e-b42c-fb9751504a9f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 919.442218] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 919.442218] env[69648]: value = "task-3466529" [ 919.442218] env[69648]: _type = "Task" [ 919.442218] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 919.450283] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466529, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 919.949481] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466529, 'name': CreateVM_Task, 'duration_secs': 0.312561} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 919.949672] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 919.950368] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 919.950573] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 919.950856] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 919.951114] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0e11695b-5793-4652-8d15-8db4c92f4152 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 919.955616] env[69648]: DEBUG oslo_vmware.api [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Waiting for the task: (returnval){ [ 919.955616] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]522e1f5a-9949-2fbd-edbf-1ff8951cd251" [ 919.955616] env[69648]: _type = "Task" [ 919.955616] 
env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 919.965333] env[69648]: DEBUG oslo_vmware.api [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]522e1f5a-9949-2fbd-edbf-1ff8951cd251, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 920.466307] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 920.466695] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 920.466845] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 920.988956] env[69648]: DEBUG nova.compute.manager [req-8d4c1f8c-0773-4686-83a2-7147735c2dbc req-4b634bb6-1f5b-4012-8d8b-0796adc406b6 service nova] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Received event network-changed-33dcfae2-7348-4e32-87f8-d0b98700c881 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 920.989204] env[69648]: DEBUG nova.compute.manager [req-8d4c1f8c-0773-4686-83a2-7147735c2dbc req-4b634bb6-1f5b-4012-8d8b-0796adc406b6 service nova] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Refreshing instance network info cache due to event network-changed-33dcfae2-7348-4e32-87f8-d0b98700c881. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 920.989488] env[69648]: DEBUG oslo_concurrency.lockutils [req-8d4c1f8c-0773-4686-83a2-7147735c2dbc req-4b634bb6-1f5b-4012-8d8b-0796adc406b6 service nova] Acquiring lock "refresh_cache-8e6a4fd6-5f80-476d-9789-adea1be2ae72" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 920.989637] env[69648]: DEBUG oslo_concurrency.lockutils [req-8d4c1f8c-0773-4686-83a2-7147735c2dbc req-4b634bb6-1f5b-4012-8d8b-0796adc406b6 service nova] Acquired lock "refresh_cache-8e6a4fd6-5f80-476d-9789-adea1be2ae72" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 920.989799] env[69648]: DEBUG nova.network.neutron [req-8d4c1f8c-0773-4686-83a2-7147735c2dbc req-4b634bb6-1f5b-4012-8d8b-0796adc406b6 service nova] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Refreshing network info cache for port 33dcfae2-7348-4e32-87f8-d0b98700c881 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 921.322832] env[69648]: DEBUG nova.network.neutron [req-8d4c1f8c-0773-4686-83a2-7147735c2dbc req-4b634bb6-1f5b-4012-8d8b-0796adc406b6 service nova] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Updated VIF entry in instance network info cache for port 33dcfae2-7348-4e32-87f8-d0b98700c881. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 921.323203] env[69648]: DEBUG nova.network.neutron [req-8d4c1f8c-0773-4686-83a2-7147735c2dbc req-4b634bb6-1f5b-4012-8d8b-0796adc406b6 service nova] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Updating instance_info_cache with network_info: [{"id": "33dcfae2-7348-4e32-87f8-d0b98700c881", "address": "fa:16:3e:08:79:ce", "network": {"id": "8ea7f3e7-ea70-400f-8cf2-6220fb85d97b", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-303029862-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d15936ad51c147498487afaf1a48e20d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ab9e5e6-9bf8-4a8d-91c8-d22148e3d2ee", "external-id": "nsx-vlan-transportzone-401", "segmentation_id": 401, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap33dcfae2-73", "ovs_interfaceid": "33dcfae2-7348-4e32-87f8-d0b98700c881", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 921.339981] env[69648]: DEBUG oslo_concurrency.lockutils [req-8d4c1f8c-0773-4686-83a2-7147735c2dbc req-4b634bb6-1f5b-4012-8d8b-0796adc406b6 service nova] Releasing lock "refresh_cache-8e6a4fd6-5f80-476d-9789-adea1be2ae72" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 924.797730] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] 
Acquiring lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 927.437587] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 927.437861] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 945.066957] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 947.065583] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 947.066398] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 948.065583] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 949.060476] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 949.065196] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 949.065363] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 949.065487] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 949.091075] env[69648]: DEBUG nova.compute.manager [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 949.091075] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 949.091075] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 949.092168] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 949.092168] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 949.092168] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 949.092168] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 949.092168] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 949.092307] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 949.092307] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 949.092307] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 949.092799] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 949.092945] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 949.093155] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 949.104544] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 949.104778] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 949.104957] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 949.105137] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 949.106206] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-298131a6-42ab-441d-95d8-aa1b9cc802c6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.117517] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18d2432f-2ada-4cd7-852b-a94cf711a003 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.132400] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e612aa2a-300f-49f2-a2ea-c2ab02b41703 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.138976] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b60446ff-6a2e-4c54-ad70-2e383140e5a7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.167991] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: 
name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180981MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 949.168177] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 949.168378] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 949.242457] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 20bce654-7f57-4de6-8f7a-c1b34286fc86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 949.242689] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 928bc799-4fed-4005-89d2-e18196f88ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 949.242863] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ed63f202-c76d-4492-b738-606ee1c6b059 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 949.242997] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 1756fcf7-3d68-4d02-9a66-619d0a1a9505 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 949.243135] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 642ba6f1-b912-4f55-9199-9c98b58ffc1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 949.243258] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 91fcee48-3466-480d-bf87-bc4de17fbf31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 949.243380] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 949.243498] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 949.243628] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 949.243751] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 949.257075] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60b00251-25fc-483d-88fe-a84165d6a435 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.270929] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 401783f7-a434-4c01-8f9a-e3f5fecd10da has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.283260] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6062dd02-230d-42bc-8304-fc122f1f1489 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.294962] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc6cfa72-0132-4bf2-9054-b1064d3e4efb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.304908] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a8a08a83-45f8-43d1-b405-52c751bc2e0a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.335440] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0fe09233-c2e0-4d7b-b8df-689df7fdbced has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.347692] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 933d6ead-8da8-43cd-9f02-9373bad0348d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.357662] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3968f404-57ba-4088-b516-eb9c085f6b75 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.367447] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8b1d1227-c6ec-4f6a-8076-b8e4b4efa12a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.376497] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 7532d0c5-20f4-4b64-85f1-e7b16d15acf8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.385787] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6f31fc53-7a85-4db2-977a-a02f174c1eca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.394633] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c6473d99-b222-4b4b-8d2d-61876e54dc43 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.403470] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance da62948a-a57e-4a0a-9fad-fc7de9f5f878 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.412611] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance b3a99599-9514-4702-b01e-95ccb064ed4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.421568] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 7ca86d88-f679-40bd-a46d-20c39fe13247 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.430515] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0fa3ac73-db70-4034-91da-29e42cefc471 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.440125] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 4f321d17-20ad-49d2-9952-c27e1161f339 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.448510] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e64fd474-91ab-449e-8785-e788685ed77a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.457753] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 949.457993] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 949.458191] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 949.770365] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcf08a2d-ba76-47dc-b9ff-55b8d70c2189 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.777661] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03fc4c35-1dc3-4e35-92b1-0ce21ee0000e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.806759] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-892b7647-6442-4eeb-9f28-b3cc84280cd2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.813559] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2eec3a0-d586-4aa1-8374-019d502de808 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.826235] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 949.834524] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 949.847286] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record 
updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 949.847463] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.679s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 950.820265] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 963.816359] env[69648]: WARNING oslo_vmware.rw_handles [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 963.816359] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 963.816359] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 963.816359] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 963.816359] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 963.816359] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 963.816359] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 963.816359] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 963.816359] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 963.816359] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 963.816359] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 963.816359] env[69648]: ERROR oslo_vmware.rw_handles [ 963.816960] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/b3770000-3f68-4c71-a7a1-ae17a0760f4f/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 963.818705] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 963.818964] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Copying Virtual Disk [datastore1] vmware_temp/b3770000-3f68-4c71-a7a1-ae17a0760f4f/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] 
vmware_temp/b3770000-3f68-4c71-a7a1-ae17a0760f4f/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 963.819289] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0d46d9c9-6baa-4b19-8ad7-c31cc1c1de3d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.826819] env[69648]: DEBUG oslo_vmware.api [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Waiting for the task: (returnval){ [ 963.826819] env[69648]: value = "task-3466530" [ 963.826819] env[69648]: _type = "Task" [ 963.826819] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 963.835886] env[69648]: DEBUG oslo_vmware.api [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Task: {'id': task-3466530, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 964.339403] env[69648]: DEBUG oslo_vmware.exceptions [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 964.339728] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 964.340499] env[69648]: ERROR nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 964.340499] env[69648]: Faults: ['InvalidArgument'] [ 964.340499] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Traceback (most recent call last): [ 964.340499] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 964.340499] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] yield resources [ 964.340499] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 964.340499] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] self.driver.spawn(context, instance, image_meta, [ 964.340499] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 964.340499] env[69648]: ERROR nova.compute.manager [instance: 
20bce654-7f57-4de6-8f7a-c1b34286fc86] self._vmops.spawn(context, instance, image_meta, injected_files, [ 964.340499] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 964.340499] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] self._fetch_image_if_missing(context, vi) [ 964.340499] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] image_cache(vi, tmp_image_ds_loc) [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] vm_util.copy_virtual_disk( [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] session._wait_for_task(vmdk_copy_task) [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] return self.wait_for_task(task_ref) [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] return evt.wait() [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] result = hub.switch() [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 964.340819] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] return self.greenlet.switch() [ 964.341198] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 964.341198] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] self.f(*self.args, **self.kw) [ 964.341198] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 964.341198] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] raise exceptions.translate_fault(task_info.error) [ 964.341198] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] oslo_vmware.exceptions.VimFaultException: A specified 
parameter was not correct: fileType [ 964.341198] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Faults: ['InvalidArgument'] [ 964.341198] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] [ 964.341198] env[69648]: INFO nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Terminating instance [ 964.343025] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 964.343255] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 964.344079] env[69648]: DEBUG nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 964.344347] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 964.344642] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-472b2968-b084-4aba-bc1d-67d9ca9388ce {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.347283] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-225bcf37-26d9-4049-b565-35d6e50f853a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.354313] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 964.355446] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5ce911fd-e362-418a-872d-271b87c23cf6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.356978] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 964.357179] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 964.357849] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7c3832be-c990-4d4e-a7b8-2c266d185391 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.363098] env[69648]: DEBUG oslo_vmware.api [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Waiting for the task: (returnval){ [ 964.363098] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52433c55-7d7a-c259-138c-bf846f1adc25" [ 964.363098] env[69648]: _type = "Task" [ 964.363098] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 964.370690] env[69648]: DEBUG oslo_vmware.api [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52433c55-7d7a-c259-138c-bf846f1adc25, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 964.424530] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 964.424848] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 964.425382] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Deleting the datastore file [datastore1] 20bce654-7f57-4de6-8f7a-c1b34286fc86 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 964.425471] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cc7bf6ac-ed2e-45ac-bb9c-9159adff8f26 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.432726] env[69648]: DEBUG oslo_vmware.api [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Waiting for the task: (returnval){ [ 964.432726] env[69648]: value = "task-3466532" [ 964.432726] env[69648]: _type = "Task" [ 964.432726] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 964.440527] env[69648]: DEBUG oslo_vmware.api [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Task: {'id': task-3466532, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 964.875508] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 964.875799] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Creating directory with path [datastore1] vmware_temp/50e7ef09-0c59-48db-b77f-39c82e78cee5/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 964.876064] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f6087c30-231a-481e-af88-f0d87d7c0ec8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.887272] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Created directory with path [datastore1] vmware_temp/50e7ef09-0c59-48db-b77f-39c82e78cee5/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 964.887527] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Fetch image to [datastore1] vmware_temp/50e7ef09-0c59-48db-b77f-39c82e78cee5/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 964.887801] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/50e7ef09-0c59-48db-b77f-39c82e78cee5/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 964.888551] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-738e9ddf-3873-495b-994d-899a18e70e94 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.895436] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60a13771-1c14-40bb-9b40-bd9864863b06 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.904516] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-434fc1ff-f3db-41d3-85ef-98f119fbc1be {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.940131] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2fe7223-739e-4462-b693-535eb5f0cc34 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.947877] env[69648]: DEBUG oslo_vmware.api [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Task: {'id': task-3466532, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08566} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 964.949315] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 964.949557] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 964.949689] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 964.949855] env[69648]: INFO nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Took 0.61 seconds to destroy the instance on the hypervisor. 
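The resource tracker pass above ends with a final resource view of used_ram=1856MB, used_disk=10GB and used_vcpus=10 across the ten instances it lists as actively managed (nine holding 128 MB and one, 45ccc6ec-6501-4477-9b94-1c0e3d1271d9, holding 192 MB). Those figures reconcile if used_ram also counts the 512 MB shown as 'reserved' in the MEMORY_MB inventory; the short Python sketch below just redoes that arithmetic with values copied from the log, and is an illustration rather than Nova code.

# Per-instance placement allocations and the 512 MB 'reserved' entry are
# copied from the log above; illustration only, not Nova code.
allocations = [
    {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1},  # 20bce654-...
    {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1},  # 928bc799-...
    {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1},  # ed63f202-...
    {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1},  # 1756fcf7-...
    {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1},  # 642ba6f1-...
    {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1},  # 91fcee48-...
    {"MEMORY_MB": 192, "DISK_GB": 1, "VCPU": 1},  # 45ccc6ec-... (the 192 MB one)
    {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1},  # 63b167e7-...
    {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1},  # 62954fe5-...
    {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1},  # 8e6a4fd6-...
]
RESERVED_MEMORY_MB = 512  # 'reserved' in the MEMORY_MB inventory record

used_ram_mb = RESERVED_MEMORY_MB + sum(a["MEMORY_MB"] for a in allocations)
used_disk_gb = sum(a["DISK_GB"] for a in allocations)
used_vcpus = sum(a["VCPU"] for a in allocations)

# Expect 1856 10 10, matching "Final resource view: ... used_ram=1856MB ...
# used_disk=10GB ... used_vcpus=10" reported by the tracker.
print(used_ram_mb, used_disk_gb, used_vcpus)
assert (used_ram_mb, used_disk_gb, used_vcpus) == (1856, 10, 10)
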
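A second, related sketch: the CopyVirtualDisk_Task above (task-3466530) is polled at "progress is 0%" and then surfaces as a VimFaultException (InvalidArgument on fileType), while the later DeleteDatastoreFile_Task (task-3466532) completes in 0.086s. Both go through the same wait_for_task/_poll_task loop; the self-contained Python below models only that poll-until-success-or-fault pattern. The TaskInfo type and the poll callback are invented for illustration and are not the oslo.vmware API.

import time
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class TaskInfo:
    state: str                  # "running", "success" or "error"
    progress: int = 0
    error: Optional[str] = None

class TaskFailed(Exception):
    """Stands in for oslo.vmware's translated Vim fault."""

def wait_for_task(poll: Callable[[], TaskInfo], interval: float = 0.5) -> TaskInfo:
    # Mirrors what the log shows: repeated "progress is N%" DEBUG lines,
    # then either normal completion or a raised fault that aborts the build.
    while True:
        info = poll()
        if info.state == "success":
            return info
        if info.state == "error":
            raise TaskFailed(info.error or "task failed")
        time.sleep(interval)

# Example: a task that reports an error on the second poll, like the
# failed disk copy above.
_states = iter([TaskInfo("running", 0),
                TaskInfo("error", 0, "InvalidArgument: fileType")])
try:
    wait_for_task(lambda: next(_states), interval=0.01)
except TaskFailed as exc:
    print("task failed:", exc)
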
[ 964.951617] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7cd97c56-b03c-484f-9f6a-fdc64b662e73 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.953476] env[69648]: DEBUG nova.compute.claims [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 964.953656] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 964.953874] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 965.039560] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 965.092552] env[69648]: DEBUG oslo_vmware.rw_handles [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50e7ef09-0c59-48db-b77f-39c82e78cee5/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 965.151253] env[69648]: DEBUG oslo_vmware.rw_handles [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 965.151484] env[69648]: DEBUG oslo_vmware.rw_handles [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50e7ef09-0c59-48db-b77f-39c82e78cee5/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 965.381918] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b64aa4d9-0e73-4ba8-a4cc-c69c6e847054 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.389879] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c36384d9-4739-492c-8676-1f2e98498770 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.419697] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6216f1fa-7396-4d39-ba45-d6543b2c35a1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.426817] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c08eeed2-14eb-4e1a-80fd-f3d0bd0d813d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.440087] env[69648]: DEBUG nova.compute.provider_tree [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 965.450797] env[69648]: DEBUG nova.scheduler.client.report [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 965.465849] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.512s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 965.466076] env[69648]: ERROR nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 965.466076] env[69648]: Faults: ['InvalidArgument'] [ 965.466076] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Traceback (most recent call last): [ 965.466076] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 965.466076] env[69648]: ERROR nova.compute.manager 
[instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] self.driver.spawn(context, instance, image_meta, [ 965.466076] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 965.466076] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] self._vmops.spawn(context, instance, image_meta, injected_files, [ 965.466076] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 965.466076] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] self._fetch_image_if_missing(context, vi) [ 965.466076] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 965.466076] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] image_cache(vi, tmp_image_ds_loc) [ 965.466076] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] vm_util.copy_virtual_disk( [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] session._wait_for_task(vmdk_copy_task) [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] return self.wait_for_task(task_ref) [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] return evt.wait() [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] result = hub.switch() [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] return self.greenlet.switch() [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 965.466362] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] self.f(*self.args, **self.kw) [ 965.466626] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 965.466626] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] raise exceptions.translate_fault(task_info.error) [ 965.466626] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 965.466626] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Faults: ['InvalidArgument'] [ 965.466626] env[69648]: ERROR nova.compute.manager [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] [ 965.466805] env[69648]: DEBUG nova.compute.utils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 965.468263] env[69648]: DEBUG nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Build of instance 20bce654-7f57-4de6-8f7a-c1b34286fc86 was re-scheduled: A specified parameter was not correct: fileType [ 965.468263] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 965.468630] env[69648]: DEBUG nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 965.468803] env[69648]: DEBUG nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 965.468971] env[69648]: DEBUG nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 965.469151] env[69648]: DEBUG nova.network.neutron [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 966.042536] env[69648]: DEBUG nova.network.neutron [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 966.062142] env[69648]: INFO nova.compute.manager [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: 20bce654-7f57-4de6-8f7a-c1b34286fc86] Took 0.59 seconds to deallocate network for instance. [ 966.166643] env[69648]: INFO nova.scheduler.client.report [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Deleted allocations for instance 20bce654-7f57-4de6-8f7a-c1b34286fc86 [ 966.185736] env[69648]: DEBUG oslo_concurrency.lockutils [None req-774e3793-c4bd-4af9-a671-bea09949a2d7 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "20bce654-7f57-4de6-8f7a-c1b34286fc86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 376.372s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 966.200138] env[69648]: DEBUG nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 966.247826] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 966.248094] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 966.249817] env[69648]: INFO nova.compute.claims [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 966.599994] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae1388bc-0f0c-4dc4-aec3-cdbc05c04831 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 966.607746] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07c76419-b14f-44b7-a307-dfbe914ceb1b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 966.636898] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05c528b5-8cfd-4de2-b855-8c703163d944 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 966.644140] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b242d7bf-f186-4c75-a73f-260a2187154f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 966.657385] env[69648]: DEBUG nova.compute.provider_tree [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 966.665806] env[69648]: DEBUG nova.scheduler.client.report [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 966.680064] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.431s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 966.680064] env[69648]: DEBUG nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 966.713018] env[69648]: DEBUG nova.compute.utils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 966.715577] env[69648]: DEBUG nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 966.715753] env[69648]: DEBUG nova.network.neutron [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 966.724754] env[69648]: DEBUG nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 966.774972] env[69648]: DEBUG nova.policy [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cfdb0a16fa274c3abbc999cc6283e224', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4eec00c770df46a7a2bd2d1078623820', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 966.793466] env[69648]: DEBUG nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 966.821563] env[69648]: DEBUG nova.virt.hardware [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 966.821704] env[69648]: DEBUG nova.virt.hardware [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 966.821844] env[69648]: DEBUG nova.virt.hardware [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 966.822040] env[69648]: DEBUG nova.virt.hardware [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 966.822196] env[69648]: DEBUG nova.virt.hardware [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 966.822440] env[69648]: DEBUG nova.virt.hardware [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 966.822549] env[69648]: DEBUG nova.virt.hardware [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 966.822708] env[69648]: DEBUG nova.virt.hardware [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 966.822872] env[69648]: DEBUG 
nova.virt.hardware [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 966.823052] env[69648]: DEBUG nova.virt.hardware [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 966.823237] env[69648]: DEBUG nova.virt.hardware [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 966.824092] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eae97bdd-1c64-491f-8686-8084f105e965 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 966.831836] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e06164a0-621f-4ff0-85be-35f6d7f844e1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.207245] env[69648]: DEBUG nova.network.neutron [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Successfully created port: 05f9ce7e-7b7d-4c55-8d7f-913378180728 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 967.952427] env[69648]: DEBUG nova.network.neutron [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Successfully updated port: 05f9ce7e-7b7d-4c55-8d7f-913378180728 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 967.963544] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquiring lock "refresh_cache-60b00251-25fc-483d-88fe-a84165d6a435" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 967.963722] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquired lock "refresh_cache-60b00251-25fc-483d-88fe-a84165d6a435" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 967.963847] env[69648]: DEBUG nova.network.neutron [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 968.021772] env[69648]: DEBUG nova.network.neutron [None 
req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 968.142803] env[69648]: DEBUG nova.compute.manager [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Received event network-vif-plugged-05f9ce7e-7b7d-4c55-8d7f-913378180728 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 968.143010] env[69648]: DEBUG oslo_concurrency.lockutils [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] Acquiring lock "60b00251-25fc-483d-88fe-a84165d6a435-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 968.143241] env[69648]: DEBUG oslo_concurrency.lockutils [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] Lock "60b00251-25fc-483d-88fe-a84165d6a435-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 968.143415] env[69648]: DEBUG oslo_concurrency.lockutils [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] Lock "60b00251-25fc-483d-88fe-a84165d6a435-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 968.143588] env[69648]: DEBUG nova.compute.manager [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] No waiting events found dispatching network-vif-plugged-05f9ce7e-7b7d-4c55-8d7f-913378180728 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 968.143756] env[69648]: WARNING nova.compute.manager [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Received unexpected event network-vif-plugged-05f9ce7e-7b7d-4c55-8d7f-913378180728 for instance with vm_state building and task_state spawning. [ 968.143917] env[69648]: DEBUG nova.compute.manager [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Received event network-changed-05f9ce7e-7b7d-4c55-8d7f-913378180728 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 968.144153] env[69648]: DEBUG nova.compute.manager [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Refreshing instance network info cache due to event network-changed-05f9ce7e-7b7d-4c55-8d7f-913378180728. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 968.144365] env[69648]: DEBUG oslo_concurrency.lockutils [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] Acquiring lock "refresh_cache-60b00251-25fc-483d-88fe-a84165d6a435" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 968.275453] env[69648]: DEBUG nova.network.neutron [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Updating instance_info_cache with network_info: [{"id": "05f9ce7e-7b7d-4c55-8d7f-913378180728", "address": "fa:16:3e:bd:da:cc", "network": {"id": "97414936-ba34-4700-85c3-213341973608", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1268769319-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4eec00c770df46a7a2bd2d1078623820", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd8b2b4e-f09d-4af6-9759-d372870e9b5f", "external-id": "nsx-vlan-transportzone-800", "segmentation_id": 800, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap05f9ce7e-7b", "ovs_interfaceid": "05f9ce7e-7b7d-4c55-8d7f-913378180728", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 968.291383] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Releasing lock "refresh_cache-60b00251-25fc-483d-88fe-a84165d6a435" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 968.291700] env[69648]: DEBUG nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Instance network_info: |[{"id": "05f9ce7e-7b7d-4c55-8d7f-913378180728", "address": "fa:16:3e:bd:da:cc", "network": {"id": "97414936-ba34-4700-85c3-213341973608", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1268769319-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4eec00c770df46a7a2bd2d1078623820", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd8b2b4e-f09d-4af6-9759-d372870e9b5f", "external-id": "nsx-vlan-transportzone-800", "segmentation_id": 800, "bound_drivers": {"0": "nsxv3"}}, "devname": 
"tap05f9ce7e-7b", "ovs_interfaceid": "05f9ce7e-7b7d-4c55-8d7f-913378180728", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 968.292016] env[69648]: DEBUG oslo_concurrency.lockutils [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] Acquired lock "refresh_cache-60b00251-25fc-483d-88fe-a84165d6a435" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 968.292207] env[69648]: DEBUG nova.network.neutron [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Refreshing network info cache for port 05f9ce7e-7b7d-4c55-8d7f-913378180728 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 968.293275] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bd:da:cc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fd8b2b4e-f09d-4af6-9759-d372870e9b5f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '05f9ce7e-7b7d-4c55-8d7f-913378180728', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 968.301634] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Creating folder: Project (4eec00c770df46a7a2bd2d1078623820). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 968.302447] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-45048ec8-21fc-4ed2-8363-c234b471c47d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.317209] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Created folder: Project (4eec00c770df46a7a2bd2d1078623820) in parent group-v692308. [ 968.317432] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Creating folder: Instances. Parent ref: group-v692358. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 968.317669] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1a82185a-5f04-4daa-949d-d12b75354830 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.327842] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Created folder: Instances in parent group-v692358. 
[ 968.327842] env[69648]: DEBUG oslo.service.loopingcall [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 968.327842] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 968.327842] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7bcff9f8-665e-4589-b31d-a652bfe2b79b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.351466] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 968.351466] env[69648]: value = "task-3466535" [ 968.351466] env[69648]: _type = "Task" [ 968.351466] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 968.359814] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466535, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 968.656277] env[69648]: DEBUG nova.network.neutron [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Updated VIF entry in instance network info cache for port 05f9ce7e-7b7d-4c55-8d7f-913378180728. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 968.656548] env[69648]: DEBUG nova.network.neutron [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Updating instance_info_cache with network_info: [{"id": "05f9ce7e-7b7d-4c55-8d7f-913378180728", "address": "fa:16:3e:bd:da:cc", "network": {"id": "97414936-ba34-4700-85c3-213341973608", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1268769319-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4eec00c770df46a7a2bd2d1078623820", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd8b2b4e-f09d-4af6-9759-d372870e9b5f", "external-id": "nsx-vlan-transportzone-800", "segmentation_id": 800, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap05f9ce7e-7b", "ovs_interfaceid": "05f9ce7e-7b7d-4c55-8d7f-913378180728", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 968.666408] env[69648]: DEBUG oslo_concurrency.lockutils [req-3f9f1d2e-f931-45ec-87bb-c90a43da504d req-8b43e8f8-735a-442c-8aa2-67b95e3c219c service nova] Releasing lock "refresh_cache-60b00251-25fc-483d-88fe-a84165d6a435" {{(pid=69648) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 968.861342] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466535, 'name': CreateVM_Task, 'duration_secs': 0.28439} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 968.861529] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 968.862265] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 968.862431] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 968.862892] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 968.863692] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bf751c72-31e2-4820-a695-f3ae5023b056 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.868293] env[69648]: DEBUG oslo_vmware.api [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Waiting for the task: (returnval){ [ 968.868293] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52df1890-1eb9-c968-45b7-e9b7d3b223fd" [ 968.868293] env[69648]: _type = "Task" [ 968.868293] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 968.876859] env[69648]: DEBUG oslo_vmware.api [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52df1890-1eb9-c968-45b7-e9b7d3b223fd, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 969.380129] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 969.380497] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 969.380614] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 991.356889] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquiring lock "d5fb115d-778d-4fc7-a03a-8f5828868a01" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 991.357871] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Lock "d5fb115d-778d-4fc7-a03a-8f5828868a01" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 993.464395] env[69648]: DEBUG oslo_concurrency.lockutils [None req-197cc09c-0aa3-40e4-8836-648405be5631 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "d19d0e28-8e92-4188-b570-0488fe81ba66" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 993.464395] env[69648]: DEBUG oslo_concurrency.lockutils [None req-197cc09c-0aa3-40e4-8836-648405be5631 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "d19d0e28-8e92-4188-b570-0488fe81ba66" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 999.091266] env[69648]: DEBUG oslo_concurrency.lockutils [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "91fcee48-3466-480d-bf87-bc4de17fbf31" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1000.607064] env[69648]: DEBUG oslo_concurrency.lockutils [None req-881f9027-949f-4ffe-a349-fe487d85920b tempest-ServersTestFqdnHostnames-1467993018 tempest-ServersTestFqdnHostnames-1467993018-project-member] Acquiring lock "3b09bb16-99c5-457a-aaf6-30f2c4d7dd32" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1000.607367] env[69648]: DEBUG oslo_concurrency.lockutils [None req-881f9027-949f-4ffe-a349-fe487d85920b tempest-ServersTestFqdnHostnames-1467993018 tempest-ServersTestFqdnHostnames-1467993018-project-member] Lock "3b09bb16-99c5-457a-aaf6-30f2c4d7dd32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1004.106956] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquiring lock "60b00251-25fc-483d-88fe-a84165d6a435" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1006.061524] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1006.089240] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1008.065578] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1009.065677] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1010.060330] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1010.065044] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1010.065206] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1010.065345] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1010.098120] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1010.098120] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1010.098120] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1010.098120] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1010.098120] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1010.098496] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1010.099185] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1010.099481] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1010.099736] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1010.100014] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1010.100262] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1010.100841] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1010.623446] env[69648]: WARNING oslo_vmware.rw_handles [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1010.623446] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1010.623446] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1010.623446] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1010.623446] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1010.623446] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1010.623446] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1010.623446] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1010.623446] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1010.623446] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1010.623446] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1010.623446] env[69648]: ERROR oslo_vmware.rw_handles [ 1010.623446] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/50e7ef09-0c59-48db-b77f-39c82e78cee5/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1010.625357] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1010.625677] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Copying Virtual Disk [datastore1] vmware_temp/50e7ef09-0c59-48db-b77f-39c82e78cee5/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] 
vmware_temp/50e7ef09-0c59-48db-b77f-39c82e78cee5/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1010.625904] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-99fbc59e-81aa-424c-a2c6-a41460e7f3f9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.635692] env[69648]: DEBUG oslo_vmware.api [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Waiting for the task: (returnval){ [ 1010.635692] env[69648]: value = "task-3466536" [ 1010.635692] env[69648]: _type = "Task" [ 1010.635692] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1010.645449] env[69648]: DEBUG oslo_vmware.api [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Task: {'id': task-3466536, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1011.065425] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1011.065610] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1011.065806] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1011.092029] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1011.092029] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1011.092029] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1011.092029] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1011.092950] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e3d3a24-9bbf-4156-8d18-feac2341b61e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.101810] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38a63bc2-6fbf-48ec-87c0-c2259c3d83db {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.116100] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c807fd19-6a12-4643-8e23-69cd936ded46 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.122957] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bac1087-4fd0-4ff8-bb18-b5aa41ecd89e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.153110] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180982MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1011.153274] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1011.153505] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1011.163354] env[69648]: DEBUG oslo_vmware.exceptions [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1011.163649] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1011.164204] env[69648]: ERROR nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1011.164204] env[69648]: Faults: ['InvalidArgument'] [ 1011.164204] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Traceback (most recent call last): [ 1011.164204] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1011.164204] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] yield resources [ 1011.164204] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1011.164204] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] self.driver.spawn(context, instance, image_meta, [ 1011.164204] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1011.164204] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1011.164204] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1011.164204] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] self._fetch_image_if_missing(context, vi) [ 1011.164204] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] image_cache(vi, tmp_image_ds_loc) [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
537, in _cache_sparse_image [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] vm_util.copy_virtual_disk( [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] session._wait_for_task(vmdk_copy_task) [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] return self.wait_for_task(task_ref) [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] return evt.wait() [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] result = hub.switch() [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1011.164661] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] return self.greenlet.switch() [ 1011.165076] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1011.165076] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] self.f(*self.args, **self.kw) [ 1011.165076] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1011.165076] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] raise exceptions.translate_fault(task_info.error) [ 1011.165076] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1011.165076] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Faults: ['InvalidArgument'] [ 1011.165076] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] [ 1011.165076] env[69648]: INFO nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Terminating instance [ 1011.166041] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" 
{{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1011.166232] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1011.166509] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-14328656-bc51-4a56-a634-0077254aeabb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.168992] env[69648]: DEBUG nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1011.169235] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1011.169978] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d17febb0-962d-4be6-a7b8-5e2403d8eee6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.176613] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1011.176695] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0b57c144-eadc-40da-a14f-a8fbed3f2a77 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.178942] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1011.179139] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1011.180125] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c3495e19-50a8-41db-90f5-8e0f0639bc44 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.188506] env[69648]: DEBUG oslo_vmware.api [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Waiting for the task: (returnval){ [ 1011.188506] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5293e2cd-0a90-247b-11f8-750ea5b7a6a7" [ 1011.188506] env[69648]: _type = "Task" [ 1011.188506] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1011.201213] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1011.201456] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Creating directory with path [datastore1] vmware_temp/ab207b94-3c17-439e-b101-8afcaee7acb2/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1011.201672] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b9b1faf8-c4a6-432c-abf4-1435f5ca8b98 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.222018] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Created directory with path [datastore1] vmware_temp/ab207b94-3c17-439e-b101-8afcaee7acb2/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1011.222018] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Fetch image to [datastore1] vmware_temp/ab207b94-3c17-439e-b101-8afcaee7acb2/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1011.222226] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/ab207b94-3c17-439e-b101-8afcaee7acb2/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1011.223206] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2043b51-7d16-4586-be9d-c1e7ddb6d930 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.230409] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb7f2172-e332-49da-a2f5-af9c9f2e56f1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.244828] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0448a10-dd47-4f6c-aad7-22047e5d23aa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.249114] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1011.249328] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1011.249527] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Deleting the datastore file [datastore1] 928bc799-4fed-4005-89d2-e18196f88ffb {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1011.250037] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bbd442a4-0274-46a9-8dfa-c9aeb7f99aa4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.281101] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4a90a80-dd4b-48d2-a553-9d1fb02d08c2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.283848] env[69648]: DEBUG oslo_vmware.api [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Waiting for the task: (returnval){ [ 1011.283848] env[69648]: value = "task-3466538" [ 1011.283848] env[69648]: _type = "Task" [ 1011.283848] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1011.288259] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 928bc799-4fed-4005-89d2-e18196f88ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1011.288408] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ed63f202-c76d-4492-b738-606ee1c6b059 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1011.288537] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 1756fcf7-3d68-4d02-9a66-619d0a1a9505 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1011.288663] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 642ba6f1-b912-4f55-9199-9c98b58ffc1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1011.288786] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 91fcee48-3466-480d-bf87-bc4de17fbf31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1011.288907] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1011.289041] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1011.289166] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1011.289282] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1011.289395] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60b00251-25fc-483d-88fe-a84165d6a435 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1011.294344] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-161e9692-2b45-4a41-91b9-350c2e1abb44 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.297660] env[69648]: DEBUG oslo_vmware.api [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Task: {'id': task-3466538, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1011.302701] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6062dd02-230d-42bc-8304-fc122f1f1489 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.316727] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc6cfa72-0132-4bf2-9054-b1064d3e4efb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.320350] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1011.330749] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a8a08a83-45f8-43d1-b405-52c751bc2e0a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.344291] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0fe09233-c2e0-4d7b-b8df-689df7fdbced has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.358018] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 933d6ead-8da8-43cd-9f02-9373bad0348d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.375437] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3968f404-57ba-4088-b516-eb9c085f6b75 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.390034] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8b1d1227-c6ec-4f6a-8076-b8e4b4efa12a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.401280] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 7532d0c5-20f4-4b64-85f1-e7b16d15acf8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.409298] env[69648]: DEBUG oslo_vmware.rw_handles [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ab207b94-3c17-439e-b101-8afcaee7acb2/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1011.466703] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6f31fc53-7a85-4db2-977a-a02f174c1eca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.469937] env[69648]: DEBUG oslo_vmware.rw_handles [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Completed reading data from the image iterator. 
{{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1011.470129] env[69648]: DEBUG oslo_vmware.rw_handles [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ab207b94-3c17-439e-b101-8afcaee7acb2/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1011.478822] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c6473d99-b222-4b4b-8d2d-61876e54dc43 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.489571] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance da62948a-a57e-4a0a-9fad-fc7de9f5f878 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.503596] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance b3a99599-9514-4702-b01e-95ccb064ed4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.513516] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 7ca86d88-f679-40bd-a46d-20c39fe13247 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.524937] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0fa3ac73-db70-4034-91da-29e42cefc471 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.534989] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 4f321d17-20ad-49d2-9952-c27e1161f339 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.545224] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e64fd474-91ab-449e-8785-e788685ed77a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.555305] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.568044] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d5fb115d-778d-4fc7-a03a-8f5828868a01 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.578810] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d19d0e28-8e92-4188-b570-0488fe81ba66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.590520] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3b09bb16-99c5-457a-aaf6-30f2c4d7dd32 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1011.590800] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1011.590955] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1011.798390] env[69648]: DEBUG oslo_vmware.api [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Task: {'id': task-3466538, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.104226} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1011.798647] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1011.798836] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1011.799015] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1011.803076] env[69648]: INFO nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 1011.805027] env[69648]: DEBUG nova.compute.claims [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1011.805217] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1012.004135] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6a55483-b92d-4035-a285-c8c0db493275 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.011637] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d161b17e-ede4-40fe-864d-326ed41443fc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.043167] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba0bb17c-5400-451e-a53e-d893fd46a8ec {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.050725] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57534441-3558-43a4-a1f4-dcfca1adb609 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.064056] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1012.077598] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1012.170838] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1012.171133] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.018s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1012.171385] env[69648]: DEBUG oslo_concurrency.lockutils 
[None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.366s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1012.696116] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd256850-7760-4d67-b99a-8d6f89238ddf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.705376] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22bec45b-28f4-4abf-bb43-b75e96c3c672 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.740145] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-005daf3e-e506-47e9-964e-4bf81f1d1ed3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.748567] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d755bde-b5f1-4ef9-9c2e-adc50703f69f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.768216] env[69648]: DEBUG nova.compute.provider_tree [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1012.785052] env[69648]: DEBUG nova.scheduler.client.report [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1012.814674] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.643s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1012.815438] env[69648]: ERROR nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1012.815438] env[69648]: Faults: ['InvalidArgument'] [ 
1012.815438] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Traceback (most recent call last): [ 1012.815438] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1012.815438] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] self.driver.spawn(context, instance, image_meta, [ 1012.815438] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1012.815438] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1012.815438] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1012.815438] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] self._fetch_image_if_missing(context, vi) [ 1012.815438] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1012.815438] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] image_cache(vi, tmp_image_ds_loc) [ 1012.815438] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] vm_util.copy_virtual_disk( [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] session._wait_for_task(vmdk_copy_task) [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] return self.wait_for_task(task_ref) [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] return evt.wait() [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] result = hub.switch() [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] return self.greenlet.switch() [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1012.815803] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] self.f(*self.args, **self.kw) [ 1012.816254] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1012.816254] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] raise exceptions.translate_fault(task_info.error) [ 1012.816254] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1012.816254] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Faults: ['InvalidArgument'] [ 1012.816254] env[69648]: ERROR nova.compute.manager [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] [ 1012.816254] env[69648]: DEBUG nova.compute.utils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1012.817626] env[69648]: DEBUG nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Build of instance 928bc799-4fed-4005-89d2-e18196f88ffb was re-scheduled: A specified parameter was not correct: fileType [ 1012.817626] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1012.818033] env[69648]: DEBUG nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1012.818826] env[69648]: DEBUG nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1012.818826] env[69648]: DEBUG nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1012.818826] env[69648]: DEBUG nova.network.neutron [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1013.175865] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1013.656337] env[69648]: DEBUG nova.network.neutron [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1013.703320] env[69648]: INFO nova.compute.manager [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Took 0.88 seconds to deallocate network for instance. 
[ 1013.952076] env[69648]: INFO nova.scheduler.client.report [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Deleted allocations for instance 928bc799-4fed-4005-89d2-e18196f88ffb [ 1014.004687] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1609621-7524-4731-9c13-6781fd2fc028 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Lock "928bc799-4fed-4005-89d2-e18196f88ffb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 424.042s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1014.004687] env[69648]: DEBUG oslo_concurrency.lockutils [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Lock "928bc799-4fed-4005-89d2-e18196f88ffb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 224.645s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1014.004687] env[69648]: DEBUG oslo_concurrency.lockutils [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Acquiring lock "928bc799-4fed-4005-89d2-e18196f88ffb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1014.004897] env[69648]: DEBUG oslo_concurrency.lockutils [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Lock "928bc799-4fed-4005-89d2-e18196f88ffb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1014.004897] env[69648]: DEBUG oslo_concurrency.lockutils [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Lock "928bc799-4fed-4005-89d2-e18196f88ffb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1014.006026] env[69648]: INFO nova.compute.manager [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Terminating instance [ 1014.007704] env[69648]: DEBUG nova.compute.manager [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1014.007910] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1014.008401] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c6758667-a3a9-42ee-9730-0e78628b4e5c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.017536] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddfbfd7b-7c61-410d-93e6-a0dc8748ac34 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.032540] env[69648]: DEBUG nova.compute.manager [None req-3e643fd3-12c1-4144-9159-1bde5f48c2dd tempest-ServersTestMultiNic-1461458840 tempest-ServersTestMultiNic-1461458840-project-member] [instance: 401783f7-a434-4c01-8f9a-e3f5fecd10da] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1014.054540] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 928bc799-4fed-4005-89d2-e18196f88ffb could not be found. [ 1014.054785] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1014.054966] env[69648]: INFO nova.compute.manager [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1014.055250] env[69648]: DEBUG oslo.service.loopingcall [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1014.055582] env[69648]: DEBUG nova.compute.manager [-] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1014.055688] env[69648]: DEBUG nova.network.neutron [-] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1014.068641] env[69648]: DEBUG nova.compute.manager [None req-3e643fd3-12c1-4144-9159-1bde5f48c2dd tempest-ServersTestMultiNic-1461458840 tempest-ServersTestMultiNic-1461458840-project-member] [instance: 401783f7-a434-4c01-8f9a-e3f5fecd10da] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1014.086122] env[69648]: DEBUG nova.network.neutron [-] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1014.091809] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3e643fd3-12c1-4144-9159-1bde5f48c2dd tempest-ServersTestMultiNic-1461458840 tempest-ServersTestMultiNic-1461458840-project-member] Lock "401783f7-a434-4c01-8f9a-e3f5fecd10da" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 201.804s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1014.094254] env[69648]: INFO nova.compute.manager [-] [instance: 928bc799-4fed-4005-89d2-e18196f88ffb] Took 0.04 seconds to deallocate network for instance. [ 1014.104432] env[69648]: DEBUG nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1014.235140] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1014.235479] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1014.237462] env[69648]: INFO nova.compute.claims [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1014.296492] env[69648]: DEBUG oslo_concurrency.lockutils [None req-958c5787-7649-40f1-a385-702c7bb73e06 tempest-InstanceActionsNegativeTestJSON-643380256 tempest-InstanceActionsNegativeTestJSON-643380256-project-member] Lock "928bc799-4fed-4005-89d2-e18196f88ffb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.293s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1014.684029] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83776e8e-1fcb-4ce0-b092-331e5d6c845a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.690672] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff1e79a8-5f61-4787-bb70-a829dd1906ab {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.723538] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce2a3d2a-93ac-410c-b8dd-16361c6d484a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.734597] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaef7efb-c7bf-4583-9624-91173b0cbb54 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.748624] env[69648]: DEBUG nova.compute.provider_tree [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1014.762529] env[69648]: DEBUG nova.scheduler.client.report [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1014.797252] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.560s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1014.797252] env[69648]: DEBUG nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1014.862024] env[69648]: DEBUG nova.compute.utils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1014.862222] env[69648]: DEBUG nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Not allocating networking since 'none' was specified. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1014.879056] env[69648]: DEBUG nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1015.317831] env[69648]: DEBUG nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1015.378761] env[69648]: DEBUG nova.virt.hardware [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=<?>,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T18:31:13Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1015.379021] env[69648]: DEBUG nova.virt.hardware [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1015.379195] env[69648]: DEBUG nova.virt.hardware [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1015.379383] env[69648]: DEBUG nova.virt.hardware [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1015.379530] env[69648]: DEBUG nova.virt.hardware [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1015.379684] env[69648]: DEBUG nova.virt.hardware [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1015.379935] env[69648]: DEBUG nova.virt.hardware [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1015.380126] env[69648]: DEBUG nova.virt.hardware [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1015.380298] env[69648]: DEBUG nova.virt.hardware [None req-333cc04b-980d-4084-9d7d-d98c653b353a 
tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1015.380462] env[69648]: DEBUG nova.virt.hardware [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1015.380637] env[69648]: DEBUG nova.virt.hardware [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1015.381578] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a536854c-8a37-4024-89e1-638255365214 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.390165] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b596a19-77dc-484e-8bf3-f58f2f1c4293 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.403288] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Instance VIF info [] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1015.408841] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Creating folder: Project (2ea5725e82f146c1a040556c33b542bb). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1015.409136] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b436b1af-426e-4293-9c13-a07097d1ab30 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.418390] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Created folder: Project (2ea5725e82f146c1a040556c33b542bb) in parent group-v692308. [ 1015.418621] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Creating folder: Instances. Parent ref: group-v692361. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1015.418790] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-37543446-a43e-4cf2-9415-532fc0053b2d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.429016] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Created folder: Instances in parent group-v692361. 
[ 1015.429016] env[69648]: DEBUG oslo.service.loopingcall [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1015.429016] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1015.429016] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1f341142-38d7-4d25-8e0d-6a38e0e21b4e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.445128] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1015.445128] env[69648]: value = "task-3466541" [ 1015.445128] env[69648]: _type = "Task" [ 1015.445128] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1015.453042] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466541, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1015.955423] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466541, 'name': CreateVM_Task, 'duration_secs': 0.29606} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1015.955594] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1015.956022] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1015.956196] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1015.956510] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1015.956807] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6e3cda08-9b44-433e-94ae-787c50221f9d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.963959] env[69648]: DEBUG oslo_vmware.api [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Waiting for the task: (returnval){ [ 
1015.963959] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52ba2c64-5053-bf7e-466c-422e7493ff04" [ 1015.963959] env[69648]: _type = "Task" [ 1015.963959] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1015.972741] env[69648]: DEBUG oslo_vmware.api [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52ba2c64-5053-bf7e-466c-422e7493ff04, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1016.478078] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1016.478390] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1016.478582] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1016.819579] env[69648]: DEBUG oslo_concurrency.lockutils [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquiring lock "6062dd02-230d-42bc-8304-fc122f1f1489" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1025.952245] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquiring lock "ab839f84-b864-409e-883d-00dddb5db3db" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1025.956233] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "ab839f84-b864-409e-883d-00dddb5db3db" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1029.547074] env[69648]: DEBUG oslo_concurrency.lockutils [None req-376ef2f5-d682-435b-aaa5-e8afa8c1c078 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] 
Acquiring lock "ea075f2f-4f2d-4b1f-a6cd-e125b6554d24" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1029.547074] env[69648]: DEBUG oslo_concurrency.lockutils [None req-376ef2f5-d682-435b-aaa5-e8afa8c1c078 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "ea075f2f-4f2d-4b1f-a6cd-e125b6554d24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1038.368995] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a13e343a-bdb6-47df-97e0-430a30cb050e tempest-ServerTagsTestJSON-1019271742 tempest-ServerTagsTestJSON-1019271742-project-member] Acquiring lock "09d8d722-d63e-4675-bd53-7862c677424d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1038.369286] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a13e343a-bdb6-47df-97e0-430a30cb050e tempest-ServerTagsTestJSON-1019271742 tempest-ServerTagsTestJSON-1019271742-project-member] Lock "09d8d722-d63e-4675-bd53-7862c677424d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1040.070785] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e09a69ac-f613-4e7a-a503-2634acf964c7 tempest-AttachInterfacesUnderV243Test-278770139 tempest-AttachInterfacesUnderV243Test-278770139-project-member] Acquiring lock "93b15196-95be-471b-ab26-193e23e163ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1040.070785] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e09a69ac-f613-4e7a-a503-2634acf964c7 tempest-AttachInterfacesUnderV243Test-278770139 tempest-AttachInterfacesUnderV243Test-278770139-project-member] Lock "93b15196-95be-471b-ab26-193e23e163ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1042.862846] env[69648]: DEBUG oslo_concurrency.lockutils [None req-18c9d602-d624-4cf3-99fd-52f314023991 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] Acquiring lock "163ce80a-d23b-43ea-8d19-a93fdce9e552" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1042.863867] env[69648]: DEBUG oslo_concurrency.lockutils [None req-18c9d602-d624-4cf3-99fd-52f314023991 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] Lock "163ce80a-d23b-43ea-8d19-a93fdce9e552" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1045.162654] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d686f58-4968-4e49-b376-be3880882d15 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquiring lock "468e1b4a-8701-4413-a836-b8377877ccf1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1045.162929] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d686f58-4968-4e49-b376-be3880882d15 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "468e1b4a-8701-4413-a836-b8377877ccf1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1058.584560] env[69648]: WARNING oslo_vmware.rw_handles [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1058.584560] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1058.584560] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1058.584560] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1058.584560] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1058.584560] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1058.584560] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1058.584560] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1058.584560] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1058.584560] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1058.584560] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1058.584560] env[69648]: ERROR oslo_vmware.rw_handles [ 1058.585175] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/ab207b94-3c17-439e-b101-8afcaee7acb2/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1058.587082] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1058.587361] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 
tempest-ServersAdminTestJSON-1847130028-project-member] Copying Virtual Disk [datastore1] vmware_temp/ab207b94-3c17-439e-b101-8afcaee7acb2/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/ab207b94-3c17-439e-b101-8afcaee7acb2/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1058.587657] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1c02c7ec-c84b-4d76-bfb3-ab450629e470 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.595767] env[69648]: DEBUG oslo_vmware.api [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Waiting for the task: (returnval){ [ 1058.595767] env[69648]: value = "task-3466542" [ 1058.595767] env[69648]: _type = "Task" [ 1058.595767] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1058.604135] env[69648]: DEBUG oslo_vmware.api [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Task: {'id': task-3466542, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1059.106162] env[69648]: DEBUG oslo_vmware.exceptions [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1059.106457] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1059.107009] env[69648]: ERROR nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1059.107009] env[69648]: Faults: ['InvalidArgument'] [ 1059.107009] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Traceback (most recent call last): [ 1059.107009] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1059.107009] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] yield resources [ 1059.107009] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1059.107009] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] self.driver.spawn(context, instance, image_meta, [ 1059.107009] env[69648]: ERROR 
nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1059.107009] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1059.107009] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1059.107009] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] self._fetch_image_if_missing(context, vi) [ 1059.107009] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] image_cache(vi, tmp_image_ds_loc) [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] vm_util.copy_virtual_disk( [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] session._wait_for_task(vmdk_copy_task) [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] return self.wait_for_task(task_ref) [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] return evt.wait() [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] result = hub.switch() [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1059.107323] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] return self.greenlet.switch() [ 1059.107684] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1059.107684] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] self.f(*self.args, **self.kw) [ 1059.107684] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1059.107684] env[69648]: ERROR nova.compute.manager [instance: 
ed63f202-c76d-4492-b738-606ee1c6b059] raise exceptions.translate_fault(task_info.error) [ 1059.107684] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1059.107684] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Faults: ['InvalidArgument'] [ 1059.107684] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] [ 1059.107684] env[69648]: INFO nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Terminating instance [ 1059.108883] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1059.109104] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1059.109339] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9be0a8d0-3d34-4d1a-8594-fe044cdbcc83 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.112226] env[69648]: DEBUG nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1059.112411] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1059.113117] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1d41f62-35dd-4f77-aecf-787ae4363597 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.119612] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1059.119836] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c938cace-93b1-4c9c-9d61-af00e6334b07 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.121963] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1059.122150] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1059.123086] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-47372d99-7ab8-4956-9e7d-7314be07b2ac {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.128828] env[69648]: DEBUG oslo_vmware.api [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 1059.128828] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]523e7a47-6c86-3919-8d01-06bf1c5248d5" [ 1059.128828] env[69648]: _type = "Task" [ 1059.128828] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1059.135483] env[69648]: DEBUG oslo_vmware.api [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]523e7a47-6c86-3919-8d01-06bf1c5248d5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1059.184671] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1059.184866] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1059.185059] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Deleting the datastore file [datastore1] ed63f202-c76d-4492-b738-606ee1c6b059 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1059.185329] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1f7bcdf0-896d-4954-a4a6-e6ac6a248446 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.190915] env[69648]: DEBUG oslo_vmware.api [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Waiting for the task: (returnval){ [ 1059.190915] env[69648]: value = "task-3466544" [ 1059.190915] env[69648]: _type = "Task" [ 1059.190915] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1059.198299] env[69648]: DEBUG oslo_vmware.api [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Task: {'id': task-3466544, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1059.639153] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1059.639430] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Creating directory with path [datastore1] vmware_temp/5ba1cc85-d805-417d-8917-2057332ae62c/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1059.639657] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-76788c49-9a34-40a9-8633-60d0ab6893fc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.652035] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Created directory with path [datastore1] vmware_temp/5ba1cc85-d805-417d-8917-2057332ae62c/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1059.652262] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Fetch image to [datastore1] vmware_temp/5ba1cc85-d805-417d-8917-2057332ae62c/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1059.652439] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/5ba1cc85-d805-417d-8917-2057332ae62c/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1059.653189] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f38ab46-ad8e-4a7d-b86c-e407527bdb58 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.660036] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6340905f-a81e-4ebd-8702-46dbbfa6b4e5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.671189] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22fc9286-f46b-434d-bfc5-a7fe47eca60d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.718637] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bba5068-19b4-45ab-8f2c-c1357c227e43 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.728123] env[69648]: DEBUG oslo_vmware.api [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Task: {'id': task-3466544, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081189} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1059.730131] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1059.730394] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1059.730642] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1059.730881] env[69648]: INFO nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Took 0.62 seconds to destroy the instance on the hypervisor. 
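The CopyVirtualDisk_Task and DeleteDatastoreFile_Task records above all follow oslo.vmware's generic task pattern: a vSphere *_Task method is invoked through the API session, and wait_for_task() polls the returned task (the "Task: {...} progress is 0%" lines) until it completes or raises a translated fault such as the VimFaultException seen earlier. A minimal sketch of that call pattern, with placeholder host, credentials, datastore path and datacenter moref rather than values taken from this deployment:

    # Sketch of the oslo.vmware task invocation/polling pattern seen in this log.
    # Host, credentials, file path and moref value below are placeholders.
    from oslo_vmware import api, vim_util

    session = api.VMwareAPISession(
        'vcenter.example.com', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    content = session.vim.service_content
    # Kick off an asynchronous vSphere task (here: FileManager.DeleteDatastoreFile_Task).
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', content.fileManager,
        name='[datastore1] vmware_temp/example',
        datacenter=vim_util.get_moref('datacenter-2', 'Datacenter'))

    # Blocks while logging periodic "progress is N%" records, and raises an
    # oslo_vmware.exceptions fault (e.g. VimFaultException) if the task fails.
    session.wait_for_task(task)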
[ 1059.733652] env[69648]: DEBUG nova.compute.claims [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1059.733889] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1059.734194] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1059.738864] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-856062c5-1821-4bcf-9e7c-0e5bda15a955 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.760659] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1059.811302] env[69648]: DEBUG oslo_vmware.rw_handles [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5ba1cc85-d805-417d-8917-2057332ae62c/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1059.871723] env[69648]: DEBUG oslo_vmware.rw_handles [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1059.871936] env[69648]: DEBUG oslo_vmware.rw_handles [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5ba1cc85-d805-417d-8917-2057332ae62c/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1060.080484] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5103b10-125e-46c9-9a64-f9a43853cb58 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.087603] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cd58851-97d5-4eba-ad8a-703cf1e39224 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.117043] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f293f6ce-0916-4012-a510-917fc677dcaa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.124139] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-572bc818-aa7b-41c6-8d4f-89e21a8222fd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.137094] env[69648]: DEBUG nova.compute.provider_tree [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1060.146383] env[69648]: DEBUG nova.scheduler.client.report [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1060.161014] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.427s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.161557] env[69648]: ERROR nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1060.161557] env[69648]: Faults: ['InvalidArgument'] [ 1060.161557] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Traceback (most recent call last): [ 1060.161557] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1060.161557] env[69648]: ERROR 
nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] self.driver.spawn(context, instance, image_meta, [ 1060.161557] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1060.161557] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1060.161557] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1060.161557] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] self._fetch_image_if_missing(context, vi) [ 1060.161557] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1060.161557] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] image_cache(vi, tmp_image_ds_loc) [ 1060.161557] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] vm_util.copy_virtual_disk( [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] session._wait_for_task(vmdk_copy_task) [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] return self.wait_for_task(task_ref) [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] return evt.wait() [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] result = hub.switch() [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] return self.greenlet.switch() [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1060.161868] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] self.f(*self.args, **self.kw) [ 1060.162172] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1060.162172] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] raise exceptions.translate_fault(task_info.error) [ 1060.162172] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1060.162172] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Faults: ['InvalidArgument'] [ 1060.162172] env[69648]: ERROR nova.compute.manager [instance: ed63f202-c76d-4492-b738-606ee1c6b059] [ 1060.162293] env[69648]: DEBUG nova.compute.utils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1060.163689] env[69648]: DEBUG nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Build of instance ed63f202-c76d-4492-b738-606ee1c6b059 was re-scheduled: A specified parameter was not correct: fileType [ 1060.163689] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1060.164085] env[69648]: DEBUG nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1060.164264] env[69648]: DEBUG nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1060.164448] env[69648]: DEBUG nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1060.164610] env[69648]: DEBUG nova.network.neutron [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1060.469334] env[69648]: DEBUG nova.network.neutron [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1060.487272] env[69648]: INFO nova.compute.manager [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Took 0.32 seconds to deallocate network for instance. [ 1060.605017] env[69648]: INFO nova.scheduler.client.report [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Deleted allocations for instance ed63f202-c76d-4492-b738-606ee1c6b059 [ 1060.631608] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a80ee337-b4a0-4221-b29c-c3f8ab79448c tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "ed63f202-c76d-4492-b738-606ee1c6b059" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 465.932s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.632904] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "ed63f202-c76d-4492-b738-606ee1c6b059" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 265.131s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1060.633170] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Acquiring lock "ed63f202-c76d-4492-b738-606ee1c6b059-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1060.633427] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "ed63f202-c76d-4492-b738-606ee1c6b059-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1060.633637] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "ed63f202-c76d-4492-b738-606ee1c6b059-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.635905] env[69648]: INFO nova.compute.manager [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Terminating instance [ 1060.640852] env[69648]: DEBUG nova.compute.manager [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1060.641105] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1060.641380] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b3d0797b-13ee-4c81-9456-6d3c60cb5c7f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.652458] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8b13c33-d092-4669-b7fc-22b69ef0f429 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.663805] env[69648]: DEBUG nova.compute.manager [None req-ee9717d6-9875-49d7-8ff5-2c5a9d52f95d tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: fc6cfa72-0132-4bf2-9054-b1064d3e4efb] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1060.684290] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ed63f202-c76d-4492-b738-606ee1c6b059 could not be found. [ 1060.684503] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1060.684683] env[69648]: INFO nova.compute.manager [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Took 0.04 seconds to destroy the instance on the hypervisor. 
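The "Acquiring lock ... by ...", "Lock ... acquired ... :: waited" and "Lock ... \"released\" ... :: held" lines that dominate this part of the log are emitted by oslo.concurrency's lockutils whenever Nova serializes work per instance UUID or per shared resource (e.g. "compute_resources"). An illustrative sketch of that usage; the lock names and the function body are placeholders, not Nova's actual code:

    # Sketch of the oslo.concurrency locking pattern behind the acquire/release
    # log lines above. Lock names and the function are illustrative only.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('ed63f202-c76d-4492-b738-606ee1c6b059', 'nova-')
    def do_terminate_instance():
        # Only one greenthread holding this lock name runs the critical section;
        # entry and exit are logged with the waited/held durations seen above.
        pass

    # The same primitive can also be taken explicitly as a context manager:
    with lockutils.lock('compute_resources'):
        pass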
[ 1060.684922] env[69648]: DEBUG oslo.service.loopingcall [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1060.685149] env[69648]: DEBUG nova.compute.manager [-] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1060.685247] env[69648]: DEBUG nova.network.neutron [-] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1060.690180] env[69648]: DEBUG nova.compute.manager [None req-ee9717d6-9875-49d7-8ff5-2c5a9d52f95d tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: fc6cfa72-0132-4bf2-9054-b1064d3e4efb] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1060.714740] env[69648]: DEBUG nova.network.neutron [-] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1060.717249] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ee9717d6-9875-49d7-8ff5-2c5a9d52f95d tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "fc6cfa72-0132-4bf2-9054-b1064d3e4efb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 244.029s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.724238] env[69648]: INFO nova.compute.manager [-] [instance: ed63f202-c76d-4492-b738-606ee1c6b059] Took 0.04 seconds to deallocate network for instance. [ 1060.729176] env[69648]: DEBUG nova.compute.manager [None req-00385dc7-2d7d-432c-9eee-af054dcc92b8 tempest-FloatingIPsAssociationTestJSON-1510815455 tempest-FloatingIPsAssociationTestJSON-1510815455-project-member] [instance: a8a08a83-45f8-43d1-b405-52c751bc2e0a] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1060.752385] env[69648]: DEBUG nova.compute.manager [None req-00385dc7-2d7d-432c-9eee-af054dcc92b8 tempest-FloatingIPsAssociationTestJSON-1510815455 tempest-FloatingIPsAssociationTestJSON-1510815455-project-member] [instance: a8a08a83-45f8-43d1-b405-52c751bc2e0a] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1060.776445] env[69648]: DEBUG oslo_concurrency.lockutils [None req-00385dc7-2d7d-432c-9eee-af054dcc92b8 tempest-FloatingIPsAssociationTestJSON-1510815455 tempest-FloatingIPsAssociationTestJSON-1510815455-project-member] Lock "a8a08a83-45f8-43d1-b405-52c751bc2e0a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 242.896s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.788962] env[69648]: DEBUG nova.compute.manager [None req-304fa61d-a4cb-4079-a7be-da3360b5863a tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 0fe09233-c2e0-4d7b-b8df-689df7fdbced] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1060.820035] env[69648]: DEBUG nova.compute.manager [None req-304fa61d-a4cb-4079-a7be-da3360b5863a tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] [instance: 0fe09233-c2e0-4d7b-b8df-689df7fdbced] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1060.827181] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3fcb5fd4-816f-4eeb-b57f-8ed6d6db3045 tempest-ServersAdminTestJSON-1847130028 tempest-ServersAdminTestJSON-1847130028-project-member] Lock "ed63f202-c76d-4492-b738-606ee1c6b059" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.194s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.846488] env[69648]: DEBUG oslo_concurrency.lockutils [None req-304fa61d-a4cb-4079-a7be-da3360b5863a tempest-MigrationsAdminTest-1278809001 tempest-MigrationsAdminTest-1278809001-project-member] Lock "0fe09233-c2e0-4d7b-b8df-689df7fdbced" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.520s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.879290] env[69648]: DEBUG nova.compute.manager [None req-744bd89d-dce4-4b00-a344-e54ae2f4cafe tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: 933d6ead-8da8-43cd-9f02-9373bad0348d] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1060.907167] env[69648]: DEBUG nova.compute.manager [None req-744bd89d-dce4-4b00-a344-e54ae2f4cafe tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: 933d6ead-8da8-43cd-9f02-9373bad0348d] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1060.931014] env[69648]: DEBUG oslo_concurrency.lockutils [None req-744bd89d-dce4-4b00-a344-e54ae2f4cafe tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "933d6ead-8da8-43cd-9f02-9373bad0348d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 231.800s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.941044] env[69648]: DEBUG nova.compute.manager [None req-e9c449a6-6ea0-4713-b79b-77f0a4370b50 tempest-ServersNegativeTestMultiTenantJSON-629123672 tempest-ServersNegativeTestMultiTenantJSON-629123672-project-member] [instance: 3968f404-57ba-4088-b516-eb9c085f6b75] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1060.963949] env[69648]: DEBUG nova.compute.manager [None req-e9c449a6-6ea0-4713-b79b-77f0a4370b50 tempest-ServersNegativeTestMultiTenantJSON-629123672 tempest-ServersNegativeTestMultiTenantJSON-629123672-project-member] [instance: 3968f404-57ba-4088-b516-eb9c085f6b75] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1060.983739] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e9c449a6-6ea0-4713-b79b-77f0a4370b50 tempest-ServersNegativeTestMultiTenantJSON-629123672 tempest-ServersNegativeTestMultiTenantJSON-629123672-project-member] Lock "3968f404-57ba-4088-b516-eb9c085f6b75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.381s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.991673] env[69648]: DEBUG nova.compute.manager [None req-943063f8-f51e-4ac6-b5d4-936fd78123e0 tempest-SecurityGroupsTestJSON-1260615493 tempest-SecurityGroupsTestJSON-1260615493-project-member] [instance: 8b1d1227-c6ec-4f6a-8076-b8e4b4efa12a] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1061.016592] env[69648]: DEBUG nova.compute.manager [None req-943063f8-f51e-4ac6-b5d4-936fd78123e0 tempest-SecurityGroupsTestJSON-1260615493 tempest-SecurityGroupsTestJSON-1260615493-project-member] [instance: 8b1d1227-c6ec-4f6a-8076-b8e4b4efa12a] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1061.036982] env[69648]: DEBUG oslo_concurrency.lockutils [None req-943063f8-f51e-4ac6-b5d4-936fd78123e0 tempest-SecurityGroupsTestJSON-1260615493 tempest-SecurityGroupsTestJSON-1260615493-project-member] Lock "8b1d1227-c6ec-4f6a-8076-b8e4b4efa12a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.202s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.045665] env[69648]: DEBUG nova.compute.manager [None req-e40de771-b591-44fb-941e-c55309602b66 tempest-AttachInterfacesV270Test-364647713 tempest-AttachInterfacesV270Test-364647713-project-member] [instance: 7532d0c5-20f4-4b64-85f1-e7b16d15acf8] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1061.067863] env[69648]: DEBUG nova.compute.manager [None req-e40de771-b591-44fb-941e-c55309602b66 tempest-AttachInterfacesV270Test-364647713 tempest-AttachInterfacesV270Test-364647713-project-member] [instance: 7532d0c5-20f4-4b64-85f1-e7b16d15acf8] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1061.090920] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e40de771-b591-44fb-941e-c55309602b66 tempest-AttachInterfacesV270Test-364647713 tempest-AttachInterfacesV270Test-364647713-project-member] Lock "7532d0c5-20f4-4b64-85f1-e7b16d15acf8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.853s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.099152] env[69648]: DEBUG nova.compute.manager [None req-1fef200c-e96d-4a17-8671-cbe43faaf80d tempest-ServerAddressesNegativeTestJSON-699051968 tempest-ServerAddressesNegativeTestJSON-699051968-project-member] [instance: 6f31fc53-7a85-4db2-977a-a02f174c1eca] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1061.122164] env[69648]: DEBUG nova.compute.manager [None req-1fef200c-e96d-4a17-8671-cbe43faaf80d tempest-ServerAddressesNegativeTestJSON-699051968 tempest-ServerAddressesNegativeTestJSON-699051968-project-member] [instance: 6f31fc53-7a85-4db2-977a-a02f174c1eca] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1061.146014] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1fef200c-e96d-4a17-8671-cbe43faaf80d tempest-ServerAddressesNegativeTestJSON-699051968 tempest-ServerAddressesNegativeTestJSON-699051968-project-member] Lock "6f31fc53-7a85-4db2-977a-a02f174c1eca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 220.811s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.154748] env[69648]: DEBUG nova.compute.manager [None req-4f95580b-76a3-47e0-973f-54e35879eac8 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] [instance: c6473d99-b222-4b4b-8d2d-61876e54dc43] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1061.199064] env[69648]: DEBUG nova.compute.manager [None req-4f95580b-76a3-47e0-973f-54e35879eac8 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] [instance: c6473d99-b222-4b4b-8d2d-61876e54dc43] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1061.220833] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4f95580b-76a3-47e0-973f-54e35879eac8 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] Lock "c6473d99-b222-4b4b-8d2d-61876e54dc43" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.239s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.231561] env[69648]: DEBUG nova.compute.manager [None req-70a2bcf3-7710-44b7-9547-1bd373d1a7d9 tempest-ServerRescueTestJSONUnderV235-182724449 tempest-ServerRescueTestJSONUnderV235-182724449-project-member] [instance: da62948a-a57e-4a0a-9fad-fc7de9f5f878] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1061.256944] env[69648]: DEBUG nova.compute.manager [None req-70a2bcf3-7710-44b7-9547-1bd373d1a7d9 tempest-ServerRescueTestJSONUnderV235-182724449 tempest-ServerRescueTestJSONUnderV235-182724449-project-member] [instance: da62948a-a57e-4a0a-9fad-fc7de9f5f878] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1061.276837] env[69648]: DEBUG oslo_concurrency.lockutils [None req-70a2bcf3-7710-44b7-9547-1bd373d1a7d9 tempest-ServerRescueTestJSONUnderV235-182724449 tempest-ServerRescueTestJSONUnderV235-182724449-project-member] Lock "da62948a-a57e-4a0a-9fad-fc7de9f5f878" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.216s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.285439] env[69648]: DEBUG nova.compute.manager [None req-9497bfe8-fa77-428c-8856-f25356d886a1 tempest-ServerRescueNegativeTestJSON-258122650 tempest-ServerRescueNegativeTestJSON-258122650-project-member] [instance: b3a99599-9514-4702-b01e-95ccb064ed4a] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1061.306881] env[69648]: DEBUG nova.compute.manager [None req-9497bfe8-fa77-428c-8856-f25356d886a1 tempest-ServerRescueNegativeTestJSON-258122650 tempest-ServerRescueNegativeTestJSON-258122650-project-member] [instance: b3a99599-9514-4702-b01e-95ccb064ed4a] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1061.326906] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9497bfe8-fa77-428c-8856-f25356d886a1 tempest-ServerRescueNegativeTestJSON-258122650 tempest-ServerRescueNegativeTestJSON-258122650-project-member] Lock "b3a99599-9514-4702-b01e-95ccb064ed4a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.956s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.334922] env[69648]: DEBUG nova.compute.manager [None req-43b0abd1-b7e8-48ef-b369-b97aec3aee5e tempest-ServerRescueNegativeTestJSON-258122650 tempest-ServerRescueNegativeTestJSON-258122650-project-member] [instance: 7ca86d88-f679-40bd-a46d-20c39fe13247] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1061.356170] env[69648]: DEBUG nova.compute.manager [None req-43b0abd1-b7e8-48ef-b369-b97aec3aee5e tempest-ServerRescueNegativeTestJSON-258122650 tempest-ServerRescueNegativeTestJSON-258122650-project-member] [instance: 7ca86d88-f679-40bd-a46d-20c39fe13247] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1061.375287] env[69648]: DEBUG oslo_concurrency.lockutils [None req-43b0abd1-b7e8-48ef-b369-b97aec3aee5e tempest-ServerRescueNegativeTestJSON-258122650 tempest-ServerRescueNegativeTestJSON-258122650-project-member] Lock "7ca86d88-f679-40bd-a46d-20c39fe13247" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 215.637s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.384035] env[69648]: DEBUG nova.compute.manager [None req-cd14b2a8-a587-4a25-b3cd-dcfae09d49ed tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: 0fa3ac73-db70-4034-91da-29e42cefc471] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1061.406154] env[69648]: DEBUG nova.compute.manager [None req-cd14b2a8-a587-4a25-b3cd-dcfae09d49ed tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: 0fa3ac73-db70-4034-91da-29e42cefc471] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1061.425707] env[69648]: DEBUG oslo_concurrency.lockutils [None req-cd14b2a8-a587-4a25-b3cd-dcfae09d49ed tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "0fa3ac73-db70-4034-91da-29e42cefc471" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.247s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.433725] env[69648]: DEBUG nova.compute.manager [None req-e80aef8e-65f1-435d-b5e0-21ee3ca34dcf tempest-ServerActionsV293TestJSON-1131934826 tempest-ServerActionsV293TestJSON-1131934826-project-member] [instance: 4f321d17-20ad-49d2-9952-c27e1161f339] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1061.456178] env[69648]: DEBUG nova.compute.manager [None req-e80aef8e-65f1-435d-b5e0-21ee3ca34dcf tempest-ServerActionsV293TestJSON-1131934826 tempest-ServerActionsV293TestJSON-1131934826-project-member] [instance: 4f321d17-20ad-49d2-9952-c27e1161f339] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1061.475947] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e80aef8e-65f1-435d-b5e0-21ee3ca34dcf tempest-ServerActionsV293TestJSON-1131934826 tempest-ServerActionsV293TestJSON-1131934826-project-member] Lock "4f321d17-20ad-49d2-9952-c27e1161f339" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.267s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.486746] env[69648]: DEBUG nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1061.539883] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1061.540154] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1061.541667] env[69648]: INFO nova.compute.claims [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1061.884770] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4647ac00-99db-4e94-a573-5d82d4a3e863 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.892497] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a98deae5-44ef-4969-8e3f-001f51800aa2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.922155] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a17e79a-6e58-4fbe-8597-08a0e81f0616 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.929971] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa5a7854-8cb1-42a0-a17b-e1348d1316ac {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.943401] env[69648]: DEBUG nova.compute.provider_tree [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1061.952511] env[69648]: DEBUG nova.scheduler.client.report [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1061.971486] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.431s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.972112] env[69648]: DEBUG nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1062.007350] env[69648]: DEBUG nova.compute.utils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1062.008970] env[69648]: DEBUG nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Not allocating networking since 'none' was specified. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1062.018582] env[69648]: DEBUG nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1062.092277] env[69648]: DEBUG nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1062.121685] env[69648]: DEBUG nova.virt.hardware [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1062.121967] env[69648]: DEBUG nova.virt.hardware [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1062.122139] env[69648]: DEBUG nova.virt.hardware [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1062.122324] env[69648]: DEBUG nova.virt.hardware [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1062.122470] env[69648]: DEBUG nova.virt.hardware [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1062.122615] env[69648]: DEBUG nova.virt.hardware [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1062.122824] env[69648]: DEBUG nova.virt.hardware [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1062.122981] env[69648]: DEBUG nova.virt.hardware [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1062.123161] env[69648]: DEBUG nova.virt.hardware [None 
req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1062.123324] env[69648]: DEBUG nova.virt.hardware [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1062.123493] env[69648]: DEBUG nova.virt.hardware [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1062.124355] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6157174f-4000-488e-8533-f13c937488b5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.132587] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c40c50e5-830d-4dc6-8869-a7dfd869a677 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.146024] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Instance VIF info [] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1062.151358] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Creating folder: Project (33845d990c794d27b86b488ec60805ed). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1062.151605] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dca765b6-48ff-4a14-be64-2a919964b2e6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.161060] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Created folder: Project (33845d990c794d27b86b488ec60805ed) in parent group-v692308. [ 1062.161230] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Creating folder: Instances. Parent ref: group-v692364. 
{{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1062.161428] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ba6abc83-1f8b-42de-9cbd-4b7cc117f38a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.170239] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Created folder: Instances in parent group-v692364. [ 1062.170447] env[69648]: DEBUG oslo.service.loopingcall [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1062.170667] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1062.170855] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-838b0280-26bc-40d0-b76b-c60e9d93eff7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.186118] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1062.186118] env[69648]: value = "task-3466547" [ 1062.186118] env[69648]: _type = "Task" [ 1062.186118] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1062.193017] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466547, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1062.696017] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466547, 'name': CreateVM_Task, 'duration_secs': 0.243036} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1062.696852] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1062.696852] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1062.696852] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1062.697038] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1062.697284] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-55bf774d-1c19-4d27-a628-839d2b67f6b1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.701709] env[69648]: DEBUG oslo_vmware.api [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Waiting for the task: (returnval){ [ 1062.701709] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528eb3ac-afd0-cf19-c055-ba623cad0675" [ 1062.701709] env[69648]: _type = "Task" [ 1062.701709] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1062.709038] env[69648]: DEBUG oslo_vmware.api [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528eb3ac-afd0-cf19-c055-ba623cad0675, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1063.211671] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1063.211947] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1063.212143] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1066.066165] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1070.060715] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1070.067489] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1070.067714] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1071.067573] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1071.067854] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1071.067888] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1071.095974] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1071.096157] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1071.096294] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1071.096421] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1071.096549] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1071.096673] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1071.096821] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1071.096911] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1071.097040] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1071.097164] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1071.097287] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1071.097776] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1072.066767] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1072.066767] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1072.066767] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1072.066767] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1072.079772] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1072.079772] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1072.079772] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.079772] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1072.082101] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eea46cff-442a-459a-9946-8fa5e9cb0517 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1072.089615] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-222bd6ef-e58b-417e-bdc9-36783a675735 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1072.105246] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ed071a52-db03-44e5-abfc-3f6b6df708b6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1072.110775] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abbf066f-0b79-4c61-8aea-4cd896b69ada {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1072.141442] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180996MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1072.141597] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1072.142813] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1072.213894] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 1756fcf7-3d68-4d02-9a66-619d0a1a9505 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1072.214076] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 642ba6f1-b912-4f55-9199-9c98b58ffc1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1072.214203] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 91fcee48-3466-480d-bf87-bc4de17fbf31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1072.214330] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1072.214453] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1072.214572] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1072.214691] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1072.214808] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60b00251-25fc-483d-88fe-a84165d6a435 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1072.214924] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6062dd02-230d-42bc-8304-fc122f1f1489 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1072.215131] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e64fd474-91ab-449e-8785-e788685ed77a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1072.227357] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1072.239656] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d5fb115d-778d-4fc7-a03a-8f5828868a01 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1072.255680] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d19d0e28-8e92-4188-b570-0488fe81ba66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1072.268962] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3b09bb16-99c5-457a-aaf6-30f2c4d7dd32 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1072.280293] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ab839f84-b864-409e-883d-00dddb5db3db has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1072.290880] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ea075f2f-4f2d-4b1f-a6cd-e125b6554d24 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1072.302317] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 09d8d722-d63e-4675-bd53-7862c677424d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1072.314135] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 93b15196-95be-471b-ab26-193e23e163ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1072.324598] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 163ce80a-d23b-43ea-8d19-a93fdce9e552 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1072.338440] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 468e1b4a-8701-4413-a836-b8377877ccf1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1072.338722] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1072.338875] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1072.655051] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-346cb794-5c7c-4c1d-8110-ba24cdd7a1a7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1072.668497] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03b1a488-afbc-4ab2-90dd-f60855099e77 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1072.700630] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac46dee5-4299-45c9-8735-c0efdb08b14c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1072.708209] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-162295d9-0a22-47f1-87d0-7fa121c2c240 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1072.721198] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1072.731895] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1072.752561] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1072.753638] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1090.747191] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquiring lock "e64fd474-91ab-449e-8785-e788685ed77a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1105.659192] env[69648]: WARNING oslo_vmware.rw_handles [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1105.659192] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1105.659192] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1105.659192] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1105.659192] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1105.659192] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1105.659192] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1105.659192] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1105.659192] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1105.659192] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1105.659192] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1105.659192] env[69648]: ERROR oslo_vmware.rw_handles [ 1105.659918] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/5ba1cc85-d805-417d-8917-2057332ae62c/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1105.661768] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1105.662029] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Copying Virtual Disk [datastore1] vmware_temp/5ba1cc85-d805-417d-8917-2057332ae62c/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/5ba1cc85-d805-417d-8917-2057332ae62c/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1105.662337] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5255730b-f6b1-484f-b5e2-c4a55e191602 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.670254] env[69648]: DEBUG oslo_vmware.api [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 1105.670254] env[69648]: value = "task-3466548" [ 1105.670254] env[69648]: _type = "Task" [ 1105.670254] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1105.678316] env[69648]: DEBUG oslo_vmware.api [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': task-3466548, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1106.181007] env[69648]: DEBUG oslo_vmware.exceptions [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1106.181420] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1106.181925] env[69648]: ERROR nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1106.181925] env[69648]: Faults: ['InvalidArgument'] [ 1106.181925] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Traceback (most recent call last): [ 1106.181925] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1106.181925] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] yield resources [ 1106.181925] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1106.181925] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] self.driver.spawn(context, instance, image_meta, [ 1106.181925] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1106.181925] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1106.181925] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1106.181925] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] self._fetch_image_if_missing(context, vi) [ 1106.181925] env[69648]: ERROR nova.compute.manager 
[instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1106.181925] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] image_cache(vi, tmp_image_ds_loc) [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] vm_util.copy_virtual_disk( [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] session._wait_for_task(vmdk_copy_task) [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] return self.wait_for_task(task_ref) [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] return evt.wait() [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] result = hub.switch() [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] return self.greenlet.switch() [ 1106.182351] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1106.182684] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] self.f(*self.args, **self.kw) [ 1106.182684] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1106.182684] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] raise exceptions.translate_fault(task_info.error) [ 1106.182684] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1106.182684] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Faults: ['InvalidArgument'] [ 1106.182684] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] [ 1106.182684] env[69648]: INFO nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] 
[instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Terminating instance [ 1106.183853] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1106.184207] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1106.184338] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6466957a-30f0-4576-9019-38d8f76fea18 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.186703] env[69648]: DEBUG nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1106.186893] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1106.187607] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8abb8f8-34c3-492b-acad-7a053c37fcc9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.194347] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1106.194558] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-13a67260-7cb5-4ba1-abfb-31f0d9eb2787 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.196755] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1106.196928] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1106.197862] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b6de1eec-1aef-45b7-9eff-a1c81f93218a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.203633] env[69648]: DEBUG oslo_vmware.api [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 1106.203633] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52fe9302-169f-5fb6-4bc3-97fe7a4ef070" [ 1106.203633] env[69648]: _type = "Task" [ 1106.203633] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1106.210568] env[69648]: DEBUG oslo_vmware.api [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52fe9302-169f-5fb6-4bc3-97fe7a4ef070, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1106.261333] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1106.261459] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1106.261580] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Deleting the datastore file [datastore1] 1756fcf7-3d68-4d02-9a66-619d0a1a9505 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1106.261836] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1899e95e-9012-4ddc-9a62-526d0d57c6f6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.268526] env[69648]: DEBUG oslo_vmware.api [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 1106.268526] env[69648]: value = "task-3466550" [ 1106.268526] env[69648]: _type = "Task" [ 1106.268526] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1106.275954] env[69648]: DEBUG oslo_vmware.api [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': task-3466550, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1106.716066] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1106.716066] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Creating directory with path [datastore1] vmware_temp/e92bd85a-3c51-4155-ab0c-e337e99c739e/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1106.716066] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3c15ecaa-dcf8-4d52-9bf5-56e9ebf49e99 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.726165] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Created directory with path [datastore1] vmware_temp/e92bd85a-3c51-4155-ab0c-e337e99c739e/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1106.726349] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Fetch image to [datastore1] vmware_temp/e92bd85a-3c51-4155-ab0c-e337e99c739e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1106.726528] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/e92bd85a-3c51-4155-ab0c-e337e99c739e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1106.727225] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eacd4624-a8a9-4f38-9975-da0eacb35c5f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.733711] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d48e1467-f330-42a4-8896-995220383c03 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.742302] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcd25ec3-95fb-4658-b2dd-6d863757a2b6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.775718] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1dbc6e15-b7c4-4dbd-8fe8-b905f908e03c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.782894] env[69648]: DEBUG oslo_vmware.api [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': task-3466550, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071266} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1106.783934] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1106.784143] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1106.784320] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1106.784492] env[69648]: INFO nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Took 0.60 seconds to destroy the instance on the hypervisor. 
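Both the CopyVirtualDisk_Task that fails with InvalidArgument above and the DeleteDatastoreFile_Task that completes in ~0.07s are driven by the same oslo.vmware task-polling loop: it logs "progress is N%" while the task runs, then either records duration_secs on success or raises the translated fault, which is where the VimFaultException for fileType originates. A minimal, self-contained sketch of that polling pattern (illustrative only, not the oslo.vmware implementation; fetch_task_info and TaskFault are hypothetical stand-ins):

import time


class TaskFault(Exception):
    """Stand-in for a translated task fault such as VimFaultException."""

    def __init__(self, message, faults):
        super().__init__(message)
        self.faults = faults


def wait_for_task(fetch_task_info, interval=0.5):
    """Poll a task until it finishes, mirroring the _poll_task lines above.

    fetch_task_info is assumed to return a dict like
    {'state': 'running'|'success'|'error', 'progress': int, 'error': {...}}.
    """
    started = time.monotonic()
    while True:
        info = fetch_task_info()
        if info['state'] == 'success':
            # Corresponds to "completed successfully" plus duration_secs.
            return {'duration_secs': time.monotonic() - started}
        if info['state'] == 'error':
            # Corresponds to raise exceptions.translate_fault(task_info.error)
            # in the traceback above.
            raise TaskFault(info['error']['message'], info['error']['faults'])
        # Corresponds to the periodic "progress is N%" DEBUG lines.
        print('progress is %d%%' % info.get('progress', 0))
        time.sleep(interval)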
[ 1106.786630] env[69648]: DEBUG nova.compute.claims [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1106.786804] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1106.787079] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1106.789639] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a543cea2-fec3-48c1-8172-0db5948d868e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.809822] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1106.871488] env[69648]: DEBUG oslo_vmware.rw_handles [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e92bd85a-3c51-4155-ab0c-e337e99c739e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1106.932483] env[69648]: DEBUG oslo_vmware.rw_handles [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1106.932720] env[69648]: DEBUG oslo_vmware.rw_handles [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e92bd85a-3c51-4155-ab0c-e337e99c739e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1107.119037] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-849ccf32-ad2c-4bab-b446-2d1fc4f56a52 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.128276] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ffc2547-a0cc-4a0f-b745-40d68e064d7e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.158448] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edcdbb0a-8538-43ae-8d51-9eae64f64521 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.166259] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d478ef2-bc3c-436d-aede-64d18980ffd9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.179395] env[69648]: DEBUG nova.compute.provider_tree [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1107.188309] env[69648]: DEBUG nova.scheduler.client.report [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1107.205407] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.418s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.205975] env[69648]: ERROR nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1107.205975] env[69648]: Faults: ['InvalidArgument'] [ 1107.205975] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Traceback (most recent call last): [ 1107.205975] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1107.205975] env[69648]: ERROR nova.compute.manager [instance: 
1756fcf7-3d68-4d02-9a66-619d0a1a9505] self.driver.spawn(context, instance, image_meta, [ 1107.205975] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1107.205975] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1107.205975] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1107.205975] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] self._fetch_image_if_missing(context, vi) [ 1107.205975] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1107.205975] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] image_cache(vi, tmp_image_ds_loc) [ 1107.205975] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] vm_util.copy_virtual_disk( [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] session._wait_for_task(vmdk_copy_task) [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] return self.wait_for_task(task_ref) [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] return evt.wait() [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] result = hub.switch() [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] return self.greenlet.switch() [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1107.206304] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] self.f(*self.args, **self.kw) [ 1107.206727] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1107.206727] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] raise exceptions.translate_fault(task_info.error) [ 1107.206727] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1107.206727] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Faults: ['InvalidArgument'] [ 1107.206727] env[69648]: ERROR nova.compute.manager [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] [ 1107.206727] env[69648]: DEBUG nova.compute.utils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1107.208412] env[69648]: DEBUG nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Build of instance 1756fcf7-3d68-4d02-9a66-619d0a1a9505 was re-scheduled: A specified parameter was not correct: fileType [ 1107.208412] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1107.208889] env[69648]: DEBUG nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1107.209143] env[69648]: DEBUG nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1107.209412] env[69648]: DEBUG nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1107.209657] env[69648]: DEBUG nova.network.neutron [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1107.601930] env[69648]: DEBUG nova.network.neutron [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1107.615727] env[69648]: INFO nova.compute.manager [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Took 0.41 seconds to deallocate network for instance. [ 1107.704837] env[69648]: INFO nova.scheduler.client.report [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Deleted allocations for instance 1756fcf7-3d68-4d02-9a66-619d0a1a9505 [ 1107.723063] env[69648]: DEBUG oslo_concurrency.lockutils [None req-29a50fe6-cba6-4ea0-b218-c49be93576e6 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 511.446s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.724174] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 311.251s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1107.724392] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1107.724591] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1107.724775] env[69648]: 
DEBUG oslo_concurrency.lockutils [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.726745] env[69648]: INFO nova.compute.manager [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Terminating instance [ 1107.728385] env[69648]: DEBUG nova.compute.manager [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1107.728687] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1107.729108] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a262fd09-e9bb-4dcf-9971-f2c8f7456ded {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.739225] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-347c0fa6-ec57-4aa4-8abe-87224694ff88 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.750609] env[69648]: DEBUG nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1107.772159] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1756fcf7-3d68-4d02-9a66-619d0a1a9505 could not be found. [ 1107.772372] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1107.772552] env[69648]: INFO nova.compute.manager [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Took 0.04 seconds to destroy the instance on the hypervisor. 
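The WARNING just above is the second terminate racing with the cleanup that already ran after the failed spawn: the VM is no longer registered in vCenter, so the driver swallows InstanceNotFound and still reports the instance as destroyed. A simplified sketch of that idempotent-teardown shape; destroy_on_backend, log and the local InstanceNotFound class are stand-ins for the driver call and nova.exception.InstanceNotFound:

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""


def destroy_instance(instance_uuid, destroy_on_backend, log):
    """Destroy a VM, treating an already-missing backend VM as success."""
    try:
        destroy_on_backend(instance_uuid)
    except InstanceNotFound:
        # Mirrors "Instance does not exist on backend": an earlier cleanup
        # already unregistered the VM, so there is nothing left to remove.
        log('Instance %s does not exist on backend' % instance_uuid)
    # Either way the caller sees a successful destroy, which is why the log
    # still prints "Instance destroyed" and a sub-second destroy time.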
[ 1107.772791] env[69648]: DEBUG oslo.service.loopingcall [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1107.773018] env[69648]: DEBUG nova.compute.manager [-] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1107.773122] env[69648]: DEBUG nova.network.neutron [-] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1107.800161] env[69648]: DEBUG nova.network.neutron [-] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1107.806225] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1107.806461] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1107.807927] env[69648]: INFO nova.compute.claims [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1107.811088] env[69648]: INFO nova.compute.manager [-] [instance: 1756fcf7-3d68-4d02-9a66-619d0a1a9505] Took 0.04 seconds to deallocate network for instance. 
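The "Claim successful on node domain-c8..." line, bracketed by the unchanged inventory reports for provider d38a352b-..., is the resource tracker checking the new instance against provider capacity under the configured allocation ratios. A toy version of that capacity check using the inventory values from this log; the real ResourceTracker/placement flow also accounts for existing allocations and per-consumer reservations:

def capacity(inv):
    """Usable units of one resource class: (total - reserved) * ratio."""
    return (inv['total'] - inv['reserved']) * inv['allocation_ratio']


def claim_fits(inventory, used, requested):
    """True if `requested` fits on top of `used` for every resource class."""
    return all(
        used.get(rc, 0) + amount <= capacity(inventory[rc])
        for rc, amount in requested.items())


# Inventory exactly as reported in the log for this compute node.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}
# m1.nano, per the flavor dump further down: 1 VCPU, 128 MB RAM, 1 GB disk.
print(claim_fits(inventory, {}, {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}))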
[ 1107.901247] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1c938329-2ffd-496b-9bda-a6fb6671b66f tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "1756fcf7-3d68-4d02-9a66-619d0a1a9505" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.177s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1108.089641] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d5613d4-4dab-4578-9b4c-f0e8fda850ca {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.097265] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f26b3ba-f4da-4830-8656-cfd63588b526 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.128505] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c87dd456-634f-4610-9f11-b6a32188b97d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.136016] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bced38c9-2c94-4e37-85ba-623496e128f0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.150407] env[69648]: DEBUG nova.compute.provider_tree [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1108.159728] env[69648]: DEBUG nova.scheduler.client.report [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1108.174575] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.368s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1108.175094] env[69648]: DEBUG nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Start building networks asynchronously for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1108.211638] env[69648]: DEBUG nova.compute.utils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1108.213142] env[69648]: DEBUG nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1108.213142] env[69648]: DEBUG nova.network.neutron [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1108.222458] env[69648]: DEBUG nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1108.271426] env[69648]: DEBUG nova.policy [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cc2dddbbaacd46a2b666a91962b7b61b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5dd14788cf484723b237d19251d169b9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1108.287789] env[69648]: DEBUG nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1108.312354] env[69648]: DEBUG nova.virt.hardware [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1108.312354] env[69648]: DEBUG nova.virt.hardware [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1108.312547] env[69648]: DEBUG nova.virt.hardware [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1108.312661] env[69648]: DEBUG nova.virt.hardware [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1108.312846] env[69648]: DEBUG nova.virt.hardware [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1108.312956] env[69648]: DEBUG nova.virt.hardware [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1108.313166] env[69648]: DEBUG nova.virt.hardware [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1108.313326] env[69648]: DEBUG nova.virt.hardware [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1108.313496] env[69648]: DEBUG nova.virt.hardware [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1108.313920] env[69648]: DEBUG nova.virt.hardware [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1108.313920] env[69648]: DEBUG nova.virt.hardware [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1108.314660] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be2ab6da-ac29-4f24-8ab1-cd5c8cfa3195 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.322464] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b85a5d4c-f18a-4c68-bdb3-7addf062358d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.644452] env[69648]: DEBUG nova.network.neutron [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Successfully created port: 49af927a-0bd6-4a7d-99dd-2fcae4ae04ec {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1109.449399] env[69648]: DEBUG nova.network.neutron [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Successfully updated port: 49af927a-0bd6-4a7d-99dd-2fcae4ae04ec {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1109.462599] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "refresh_cache-fc2f697a-9f8c-4de1-a9a9-8606118663d7" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1109.462753] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquired lock "refresh_cache-fc2f697a-9f8c-4de1-a9a9-8606118663d7" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1109.462908] env[69648]: DEBUG nova.network.neutron [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Building network info cache for instance {{(pid=69648) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2009}} [ 1109.500471] env[69648]: DEBUG nova.network.neutron [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1109.731609] env[69648]: DEBUG nova.network.neutron [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Updating instance_info_cache with network_info: [{"id": "49af927a-0bd6-4a7d-99dd-2fcae4ae04ec", "address": "fa:16:3e:9a:e2:60", "network": {"id": "9646bb87-1573-4497-a44e-2918d0bc9a18", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1280380194-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5dd14788cf484723b237d19251d169b9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bed837fa-6b6a-4192-a229-a99426a46065", "external-id": "nsx-vlan-transportzone-954", "segmentation_id": 954, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap49af927a-0b", "ovs_interfaceid": "49af927a-0bd6-4a7d-99dd-2fcae4ae04ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1109.739876] env[69648]: DEBUG nova.compute.manager [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Received event network-vif-plugged-49af927a-0bd6-4a7d-99dd-2fcae4ae04ec {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1109.740129] env[69648]: DEBUG oslo_concurrency.lockutils [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] Acquiring lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1109.740389] env[69648]: DEBUG oslo_concurrency.lockutils [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] Lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1109.740588] env[69648]: DEBUG oslo_concurrency.lockutils [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] Lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1109.740759] env[69648]: DEBUG nova.compute.manager [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] No waiting events found dispatching network-vif-plugged-49af927a-0bd6-4a7d-99dd-2fcae4ae04ec {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1109.740922] env[69648]: WARNING nova.compute.manager [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Received unexpected event network-vif-plugged-49af927a-0bd6-4a7d-99dd-2fcae4ae04ec for instance with vm_state building and task_state spawning. [ 1109.741370] env[69648]: DEBUG nova.compute.manager [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Received event network-changed-49af927a-0bd6-4a7d-99dd-2fcae4ae04ec {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1109.741530] env[69648]: DEBUG nova.compute.manager [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Refreshing instance network info cache due to event network-changed-49af927a-0bd6-4a7d-99dd-2fcae4ae04ec. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1109.741713] env[69648]: DEBUG oslo_concurrency.lockutils [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] Acquiring lock "refresh_cache-fc2f697a-9f8c-4de1-a9a9-8606118663d7" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1109.743114] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Releasing lock "refresh_cache-fc2f697a-9f8c-4de1-a9a9-8606118663d7" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1109.743409] env[69648]: DEBUG nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Instance network_info: |[{"id": "49af927a-0bd6-4a7d-99dd-2fcae4ae04ec", "address": "fa:16:3e:9a:e2:60", "network": {"id": "9646bb87-1573-4497-a44e-2918d0bc9a18", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1280380194-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5dd14788cf484723b237d19251d169b9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bed837fa-6b6a-4192-a229-a99426a46065", "external-id": "nsx-vlan-transportzone-954", "segmentation_id": 954, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap49af927a-0b", "ovs_interfaceid": "49af927a-0bd6-4a7d-99dd-2fcae4ae04ec", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1109.743680] env[69648]: DEBUG oslo_concurrency.lockutils [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] Acquired lock "refresh_cache-fc2f697a-9f8c-4de1-a9a9-8606118663d7" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1109.743848] env[69648]: DEBUG nova.network.neutron [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Refreshing network info cache for port 49af927a-0bd6-4a7d-99dd-2fcae4ae04ec {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1109.744891] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9a:e2:60', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'bed837fa-6b6a-4192-a229-a99426a46065', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '49af927a-0bd6-4a7d-99dd-2fcae4ae04ec', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1109.753147] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Creating folder: Project (5dd14788cf484723b237d19251d169b9). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1109.754287] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-73c89335-a770-4544-84be-da40ce8ce6d8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.767272] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Created folder: Project (5dd14788cf484723b237d19251d169b9) in parent group-v692308. [ 1109.767457] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Creating folder: Instances. Parent ref: group-v692367. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1109.767674] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-03f6efc4-da0e-4838-9c8c-6a1c35bd3984 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.775813] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Created folder: Instances in parent group-v692367. 
[ 1109.776049] env[69648]: DEBUG oslo.service.loopingcall [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1109.776224] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1109.776411] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aed4e177-c4bc-4e87-a669-37cb9951a173 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.797306] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1109.797306] env[69648]: value = "task-3466553" [ 1109.797306] env[69648]: _type = "Task" [ 1109.797306] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1109.804527] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466553, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1110.019946] env[69648]: DEBUG nova.network.neutron [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Updated VIF entry in instance network info cache for port 49af927a-0bd6-4a7d-99dd-2fcae4ae04ec. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1110.020605] env[69648]: DEBUG nova.network.neutron [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Updating instance_info_cache with network_info: [{"id": "49af927a-0bd6-4a7d-99dd-2fcae4ae04ec", "address": "fa:16:3e:9a:e2:60", "network": {"id": "9646bb87-1573-4497-a44e-2918d0bc9a18", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1280380194-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5dd14788cf484723b237d19251d169b9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bed837fa-6b6a-4192-a229-a99426a46065", "external-id": "nsx-vlan-transportzone-954", "segmentation_id": 954, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap49af927a-0b", "ovs_interfaceid": "49af927a-0bd6-4a7d-99dd-2fcae4ae04ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1110.031921] env[69648]: DEBUG oslo_concurrency.lockutils [req-e171e925-7ec8-49d4-86da-a3ec8502a263 req-b5127df0-7e1a-449f-a20c-6f95dd509a5a service nova] Releasing lock "refresh_cache-fc2f697a-9f8c-4de1-a9a9-8606118663d7" {{(pid=69648) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1110.306865] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466553, 'name': CreateVM_Task, 'duration_secs': 0.298567} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1110.307051] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1110.307706] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1110.307873] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1110.308272] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1110.308448] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e78745bc-7382-4399-8c31-75352f58fb3c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1110.312721] env[69648]: DEBUG oslo_vmware.api [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for the task: (returnval){ [ 1110.312721] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c2f3bb-228f-8b5a-b430-81c2f1594d85" [ 1110.312721] env[69648]: _type = "Task" [ 1110.312721] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1110.320084] env[69648]: DEBUG oslo_vmware.api [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c2f3bb-228f-8b5a-b430-81c2f1594d85, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1110.823237] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1110.823531] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1110.823705] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1123.065845] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1123.366934] env[69648]: DEBUG oslo_concurrency.lockutils [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1126.075409] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1128.274737] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1128.274737] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1128.463405] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196
tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "a924bdee-1e16-4d78-ac6b-9574677de55f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1128.463702] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "a924bdee-1e16-4d78-ac6b-9574677de55f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1130.060520] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1131.064880] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1131.065189] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1131.065319] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1132.065471] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1132.065842] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1132.065842] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1132.085281] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1132.085434] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Skipping network cache update for instance because it is Building.
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1132.085568] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1132.085697] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1132.085823] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1132.085947] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1132.086084] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1132.086209] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1132.086330] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1132.086450] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1132.086572] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1132.087048] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1133.065641] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1133.065962] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1134.065605] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1134.065862] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1134.078046] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1134.078260] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1134.078402] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1134.078548] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1134.079682] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da113905-8db6-47e4-b9f8-1c8c2032d3be {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.092041] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19060a3a-e868-4903-a484-8a76140254c4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.108047] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b5eec16c-a547-450f-90ab-01da6f3d0051 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.114486] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9cb92df-b622-45c1-add8-e10ab700d216 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.143742] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180981MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1134.143897] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1134.144105] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1134.296014] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 642ba6f1-b912-4f55-9199-9c98b58ffc1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1134.296196] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 91fcee48-3466-480d-bf87-bc4de17fbf31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1134.296334] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1134.296458] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1134.296579] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1134.296698] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1134.296860] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60b00251-25fc-483d-88fe-a84165d6a435 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1134.296991] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6062dd02-230d-42bc-8304-fc122f1f1489 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1134.297132] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e64fd474-91ab-449e-8785-e788685ed77a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1134.297262] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1134.308909] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d5fb115d-778d-4fc7-a03a-8f5828868a01 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1134.319810] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d19d0e28-8e92-4188-b570-0488fe81ba66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1134.330089] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3b09bb16-99c5-457a-aaf6-30f2c4d7dd32 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1134.340170] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ab839f84-b864-409e-883d-00dddb5db3db has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1134.351023] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ea075f2f-4f2d-4b1f-a6cd-e125b6554d24 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1134.363136] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 09d8d722-d63e-4675-bd53-7862c677424d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1134.373351] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 93b15196-95be-471b-ab26-193e23e163ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1134.384111] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 163ce80a-d23b-43ea-8d19-a93fdce9e552 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1134.396546] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 468e1b4a-8701-4413-a836-b8377877ccf1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1134.406736] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1134.417541] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a924bdee-1e16-4d78-ac6b-9574677de55f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1134.417775] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1134.417927] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1134.433514] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing inventories for resource provider d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1134.448507] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Updating ProviderTree inventory for provider d38a352b-7808-44da-8216-792e96aadc88 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1134.448690] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Updating inventory in ProviderTree for provider d38a352b-7808-44da-8216-792e96aadc88 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1134.459057] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing aggregate associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, aggregates: None {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1134.475651] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing trait associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, traits: 
COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1134.707427] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78a0a1e7-93a9-48af-a1c1-e420a8b0eb45 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.713840] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16351baa-7625-4a73-81f4-9e83972d0690 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.742918] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5235f5e7-69a6-4c12-9fc8-083f71a1bc5c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.749885] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6a470d2-3fd1-4ce7-992a-65099048bf65 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.764308] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1134.774215] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1134.810862] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1134.811065] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1136.065328] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1136.065675] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}} [ 1136.076540] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] 
There are 0 instances to clean {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}} [ 1136.076859] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1136.077047] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances with incomplete migration {{(pid=69648) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}} [ 1153.883130] env[69648]: WARNING oslo_vmware.rw_handles [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1153.883130] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1153.883130] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1153.883130] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1153.883130] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1153.883130] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1153.883130] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1153.883130] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1153.883130] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1153.883130] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1153.883130] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1153.883130] env[69648]: ERROR oslo_vmware.rw_handles [ 1153.883788] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/e92bd85a-3c51-4155-ab0c-e337e99c739e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1153.885569] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1153.885836] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Copying Virtual Disk [datastore1] vmware_temp/e92bd85a-3c51-4155-ab0c-e337e99c739e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] 
vmware_temp/e92bd85a-3c51-4155-ab0c-e337e99c739e/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1153.886137] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2cdb6453-8b97-4284-921e-1172fdd5fa37 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.895590] env[69648]: DEBUG oslo_vmware.api [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 1153.895590] env[69648]: value = "task-3466554" [ 1153.895590] env[69648]: _type = "Task" [ 1153.895590] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1153.902927] env[69648]: DEBUG oslo_vmware.api [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': task-3466554, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1154.405499] env[69648]: DEBUG oslo_vmware.exceptions [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1154.405796] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1154.406361] env[69648]: ERROR nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1154.406361] env[69648]: Faults: ['InvalidArgument'] [ 1154.406361] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Traceback (most recent call last): [ 1154.406361] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1154.406361] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] yield resources [ 1154.406361] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1154.406361] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] self.driver.spawn(context, instance, image_meta, [ 1154.406361] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1154.406361] 
env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1154.406361] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1154.406361] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] self._fetch_image_if_missing(context, vi) [ 1154.406361] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] image_cache(vi, tmp_image_ds_loc) [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] vm_util.copy_virtual_disk( [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] session._wait_for_task(vmdk_copy_task) [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] return self.wait_for_task(task_ref) [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] return evt.wait() [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] result = hub.switch() [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1154.406709] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] return self.greenlet.switch() [ 1154.407067] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1154.407067] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] self.f(*self.args, **self.kw) [ 1154.407067] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1154.407067] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] raise exceptions.translate_fault(task_info.error) [ 1154.407067] env[69648]: ERROR nova.compute.manager [instance: 
642ba6f1-b912-4f55-9199-9c98b58ffc1b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1154.407067] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Faults: ['InvalidArgument'] [ 1154.407067] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] [ 1154.407067] env[69648]: INFO nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Terminating instance [ 1154.408325] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1154.408658] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1154.409017] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3a095df4-8e65-4ca2-a926-10f98def811a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.411650] env[69648]: DEBUG nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1154.411844] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1154.412617] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19bd69e8-f936-4dd2-82c9-33bc0470029e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.419777] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1154.419866] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ad3779b6-bc8f-414d-b33f-8cfd09a09dc9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.422085] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1154.422273] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1154.423218] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-21cdeeca-d8d4-42f4-932b-e25b332d43a0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.427893] env[69648]: DEBUG oslo_vmware.api [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 1154.427893] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52490789-9eae-36f1-e0c0-f8467294ee9a" [ 1154.427893] env[69648]: _type = "Task" [ 1154.427893] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1154.436932] env[69648]: DEBUG oslo_vmware.api [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52490789-9eae-36f1-e0c0-f8467294ee9a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1154.489138] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1154.489385] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1154.489592] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Deleting the datastore file [datastore1] 642ba6f1-b912-4f55-9199-9c98b58ffc1b {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1154.489882] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6e551017-b329-4820-8395-fa7bf0af14b5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.497962] env[69648]: DEBUG oslo_vmware.api [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 1154.497962] env[69648]: value = "task-3466556" [ 1154.497962] env[69648]: _type = "Task" [ 1154.497962] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1154.505631] env[69648]: DEBUG oslo_vmware.api [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': task-3466556, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1154.938645] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1154.938964] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Creating directory with path [datastore1] vmware_temp/c445a913-7f50-4edf-baad-37e02b96e4b0/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1154.939146] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-caad21df-5054-4b79-9172-40710dd55a14 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.950756] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Created directory with path [datastore1] vmware_temp/c445a913-7f50-4edf-baad-37e02b96e4b0/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1154.950936] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Fetch image to [datastore1] vmware_temp/c445a913-7f50-4edf-baad-37e02b96e4b0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1154.951123] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/c445a913-7f50-4edf-baad-37e02b96e4b0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1154.951849] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f5ce4d7-a7fe-4bb9-a2da-93000b9a7b36 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.958124] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c795308e-2fba-4d4f-94c8-aeec42e44099 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.966845] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f925f0f-b07c-49bc-868b-63a121951c14 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.998274] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-402a612a-51a1-4db8-a952-b7165dac4516 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.008564] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-07855423-ee4b-4301-b12a-a0fc4a040e80 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.010183] env[69648]: DEBUG oslo_vmware.api [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': task-3466556, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.094295} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1155.010413] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1155.010587] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1155.010756] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1155.010928] env[69648]: INFO nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1155.012979] env[69648]: DEBUG nova.compute.claims [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1155.013186] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1155.013413] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1155.031715] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1155.081441] env[69648]: DEBUG oslo_vmware.rw_handles [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c445a913-7f50-4edf-baad-37e02b96e4b0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1155.139467] env[69648]: DEBUG oslo_vmware.rw_handles [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1155.139648] env[69648]: DEBUG oslo_vmware.rw_handles [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c445a913-7f50-4edf-baad-37e02b96e4b0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1155.325776] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0416b7b5-65b8-478d-ba01-1f44611efa6c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.333362] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd9ee2fe-122d-4a6d-a426-3fd59969a425 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.362549] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ae21172-e5f1-421e-975a-ea6e9e9fb914 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.369034] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a403493-b144-4c93-9fca-c8046fbd5961 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.381549] env[69648]: DEBUG nova.compute.provider_tree [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1155.391286] env[69648]: DEBUG nova.scheduler.client.report [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1155.404792] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.391s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.405286] env[69648]: ERROR nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1155.405286] env[69648]: Faults: ['InvalidArgument'] [ 1155.405286] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Traceback (most recent call last): [ 1155.405286] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 
1155.405286] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] self.driver.spawn(context, instance, image_meta, [ 1155.405286] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1155.405286] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1155.405286] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1155.405286] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] self._fetch_image_if_missing(context, vi) [ 1155.405286] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1155.405286] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] image_cache(vi, tmp_image_ds_loc) [ 1155.405286] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] vm_util.copy_virtual_disk( [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] session._wait_for_task(vmdk_copy_task) [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] return self.wait_for_task(task_ref) [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] return evt.wait() [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] result = hub.switch() [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] return self.greenlet.switch() [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1155.405615] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] self.f(*self.args, **self.kw) [ 1155.405996] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] 
File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1155.405996] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] raise exceptions.translate_fault(task_info.error) [ 1155.405996] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1155.405996] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Faults: ['InvalidArgument'] [ 1155.405996] env[69648]: ERROR nova.compute.manager [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] [ 1155.405996] env[69648]: DEBUG nova.compute.utils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1155.407243] env[69648]: DEBUG nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Build of instance 642ba6f1-b912-4f55-9199-9c98b58ffc1b was re-scheduled: A specified parameter was not correct: fileType [ 1155.407243] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1155.407611] env[69648]: DEBUG nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1155.407781] env[69648]: DEBUG nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1155.407950] env[69648]: DEBUG nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1155.408126] env[69648]: DEBUG nova.network.neutron [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1155.761317] env[69648]: DEBUG nova.network.neutron [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1155.780036] env[69648]: INFO nova.compute.manager [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 642ba6f1-b912-4f55-9199-9c98b58ffc1b] Took 0.37 seconds to deallocate network for instance. [ 1155.879706] env[69648]: INFO nova.scheduler.client.report [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Deleted allocations for instance 642ba6f1-b912-4f55-9199-9c98b58ffc1b [ 1155.904831] env[69648]: DEBUG oslo_concurrency.lockutils [None req-af9449c7-8874-486a-a3f5-cdb3137d52b3 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "642ba6f1-b912-4f55-9199-9c98b58ffc1b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 557.704s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.922291] env[69648]: DEBUG nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1155.971724] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1155.972030] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1155.973651] env[69648]: INFO nova.compute.claims [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1156.243875] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb0e2c29-3bf0-4672-9e5b-335fe9203b8d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.251571] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6950136-535c-4cd4-8901-b9c89eab590a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.280674] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac94720b-7ad0-43e9-b0da-5b8f22cf7519 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.287344] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d3b798e-490b-4196-9a93-cf790fab4f86 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.300907] env[69648]: DEBUG nova.compute.provider_tree [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1156.309504] env[69648]: DEBUG nova.scheduler.client.report [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1156.322200] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1156.322654] env[69648]: DEBUG nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1156.352121] env[69648]: DEBUG nova.compute.utils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1156.353466] env[69648]: DEBUG nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1156.353635] env[69648]: DEBUG nova.network.neutron [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1156.361132] env[69648]: DEBUG nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1156.414748] env[69648]: DEBUG nova.policy [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e8e3a2ee2f914d76b7ab7b2643022886', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '088578c6ccac4c94985099bce136049a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1156.421640] env[69648]: DEBUG nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1156.446173] env[69648]: DEBUG nova.virt.hardware [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1156.446431] env[69648]: DEBUG nova.virt.hardware [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1156.446588] env[69648]: DEBUG nova.virt.hardware [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1156.446767] env[69648]: DEBUG nova.virt.hardware [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1156.446914] env[69648]: DEBUG nova.virt.hardware [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1156.447076] env[69648]: DEBUG nova.virt.hardware [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1156.447288] env[69648]: DEBUG nova.virt.hardware [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1156.447449] env[69648]: DEBUG nova.virt.hardware [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1156.447618] env[69648]: DEBUG 
nova.virt.hardware [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1156.447780] env[69648]: DEBUG nova.virt.hardware [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1156.447954] env[69648]: DEBUG nova.virt.hardware [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1156.448816] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25ffb5e4-8ab8-481c-94cf-c4d3c55628ef {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.456639] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-188cabcf-e41e-414c-bc58-d5dacc76b8c2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.709684] env[69648]: DEBUG nova.network.neutron [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Successfully created port: b06086fc-11a8-4c38-b201-482757f6bddc {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1157.591016] env[69648]: DEBUG nova.network.neutron [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Successfully updated port: b06086fc-11a8-4c38-b201-482757f6bddc {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1157.602420] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquiring lock "refresh_cache-d5fb115d-778d-4fc7-a03a-8f5828868a01" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1157.602513] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquired lock "refresh_cache-d5fb115d-778d-4fc7-a03a-8f5828868a01" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1157.603082] env[69648]: DEBUG nova.network.neutron [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1157.641355] env[69648]: DEBUG nova.network.neutron [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a 
tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1157.794654] env[69648]: DEBUG nova.network.neutron [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Updating instance_info_cache with network_info: [{"id": "b06086fc-11a8-4c38-b201-482757f6bddc", "address": "fa:16:3e:e5:bb:7f", "network": {"id": "c193b124-aa1d-4cc9-8f8a-d43aa03becdf", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1201132275-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "088578c6ccac4c94985099bce136049a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e044cfd4-1b0d-4d88-b1bd-604025731d3f", "external-id": "nsx-vlan-transportzone-372", "segmentation_id": 372, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb06086fc-11", "ovs_interfaceid": "b06086fc-11a8-4c38-b201-482757f6bddc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1157.807750] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Releasing lock "refresh_cache-d5fb115d-778d-4fc7-a03a-8f5828868a01" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1157.808046] env[69648]: DEBUG nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Instance network_info: |[{"id": "b06086fc-11a8-4c38-b201-482757f6bddc", "address": "fa:16:3e:e5:bb:7f", "network": {"id": "c193b124-aa1d-4cc9-8f8a-d43aa03becdf", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1201132275-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "088578c6ccac4c94985099bce136049a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e044cfd4-1b0d-4d88-b1bd-604025731d3f", "external-id": "nsx-vlan-transportzone-372", "segmentation_id": 372, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb06086fc-11", "ovs_interfaceid": "b06086fc-11a8-4c38-b201-482757f6bddc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1157.808645] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e5:bb:7f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e044cfd4-1b0d-4d88-b1bd-604025731d3f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b06086fc-11a8-4c38-b201-482757f6bddc', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1157.816634] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Creating folder: Project (088578c6ccac4c94985099bce136049a). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1157.817138] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b0a520e9-b3df-4b71-a6f5-e9b089b2e119 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.828690] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Created folder: Project (088578c6ccac4c94985099bce136049a) in parent group-v692308. [ 1157.828870] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Creating folder: Instances. Parent ref: group-v692370. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1157.829088] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4f1a9063-d22d-4199-9bb6-881b4b455808 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.836723] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Created folder: Instances in parent group-v692370. [ 1157.836944] env[69648]: DEBUG oslo.service.loopingcall [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1157.837132] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1157.837323] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5264559f-c8e4-4b6b-a2ce-3d847032f1b9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.855247] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1157.855247] env[69648]: value = "task-3466559" [ 1157.855247] env[69648]: _type = "Task" [ 1157.855247] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1157.866157] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466559, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1158.066285] env[69648]: DEBUG nova.compute.manager [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Received event network-vif-plugged-b06086fc-11a8-4c38-b201-482757f6bddc {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1158.066514] env[69648]: DEBUG oslo_concurrency.lockutils [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] Acquiring lock "d5fb115d-778d-4fc7-a03a-8f5828868a01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1158.066733] env[69648]: DEBUG oslo_concurrency.lockutils [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] Lock "d5fb115d-778d-4fc7-a03a-8f5828868a01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1158.066903] env[69648]: DEBUG oslo_concurrency.lockutils [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] Lock "d5fb115d-778d-4fc7-a03a-8f5828868a01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1158.067476] env[69648]: DEBUG nova.compute.manager [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] No waiting events found dispatching network-vif-plugged-b06086fc-11a8-4c38-b201-482757f6bddc {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1158.067574] env[69648]: WARNING nova.compute.manager [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Received unexpected event network-vif-plugged-b06086fc-11a8-4c38-b201-482757f6bddc for instance with vm_state building and task_state spawning. 
[ 1158.067850] env[69648]: DEBUG nova.compute.manager [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Received event network-changed-b06086fc-11a8-4c38-b201-482757f6bddc {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1158.067939] env[69648]: DEBUG nova.compute.manager [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Refreshing instance network info cache due to event network-changed-b06086fc-11a8-4c38-b201-482757f6bddc. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1158.068201] env[69648]: DEBUG oslo_concurrency.lockutils [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] Acquiring lock "refresh_cache-d5fb115d-778d-4fc7-a03a-8f5828868a01" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1158.068316] env[69648]: DEBUG oslo_concurrency.lockutils [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] Acquired lock "refresh_cache-d5fb115d-778d-4fc7-a03a-8f5828868a01" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1158.068425] env[69648]: DEBUG nova.network.neutron [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Refreshing network info cache for port b06086fc-11a8-4c38-b201-482757f6bddc {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1158.369218] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466559, 'name': CreateVM_Task, 'duration_secs': 0.303185} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1158.369318] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1158.374036] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1158.374036] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1158.374036] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1158.374036] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-56c50e71-62e5-4d00-9874-01764d6085d0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1158.376506] env[69648]: DEBUG oslo_vmware.api [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Waiting for the task: (returnval){ [ 1158.376506] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52d5649a-f4d2-cdf8-3c1a-13bc3a8ed686" [ 1158.376506] env[69648]: _type = "Task" [ 1158.376506] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1158.387335] env[69648]: DEBUG oslo_vmware.api [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52d5649a-f4d2-cdf8-3c1a-13bc3a8ed686, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1158.411645] env[69648]: DEBUG nova.network.neutron [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Updated VIF entry in instance network info cache for port b06086fc-11a8-4c38-b201-482757f6bddc. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1158.412156] env[69648]: DEBUG nova.network.neutron [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Updating instance_info_cache with network_info: [{"id": "b06086fc-11a8-4c38-b201-482757f6bddc", "address": "fa:16:3e:e5:bb:7f", "network": {"id": "c193b124-aa1d-4cc9-8f8a-d43aa03becdf", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1201132275-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "088578c6ccac4c94985099bce136049a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e044cfd4-1b0d-4d88-b1bd-604025731d3f", "external-id": "nsx-vlan-transportzone-372", "segmentation_id": 372, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb06086fc-11", "ovs_interfaceid": "b06086fc-11a8-4c38-b201-482757f6bddc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1158.424477] env[69648]: DEBUG oslo_concurrency.lockutils [req-606ff4d3-6f67-4469-bd5b-9b4045488a27 req-2bf30115-ddfd-4cb4-ba12-39ca74a5098d service nova] Releasing lock "refresh_cache-d5fb115d-778d-4fc7-a03a-8f5828868a01" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1158.887188] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1158.887553] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1158.887641] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1173.971597] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_power_states {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1173.995330] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Getting list of instances from cluster (obj){ [ 1173.995330] env[69648]: value = "domain-c8" [ 1173.995330] env[69648]: _type = "ClusterComputeResource" [ 1173.995330] env[69648]: } {{(pid=69648) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1173.996727] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-419cfdcc-825f-499c-ba58-3104587c1b00 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.013937] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Got total of 10 instances {{(pid=69648) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1174.014127] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 91fcee48-3466-480d-bf87-bc4de17fbf31 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1174.014323] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1174.014521] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 63b167e7-3d86-4ee4-8bae-bfb8fe084135 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1174.014646] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 62954fe5-a462-40bd-85ec-d03b98d2ec42 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1174.014799] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 8e6a4fd6-5f80-476d-9789-adea1be2ae72 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1174.014950] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 60b00251-25fc-483d-88fe-a84165d6a435 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1174.015159] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 6062dd02-230d-42bc-8304-fc122f1f1489 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1174.015260] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid e64fd474-91ab-449e-8785-e788685ed77a {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1174.015411] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid fc2f697a-9f8c-4de1-a9a9-8606118663d7 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1174.015573] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid d5fb115d-778d-4fc7-a03a-8f5828868a01 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1174.015873] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "91fcee48-3466-480d-bf87-bc4de17fbf31" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.016125] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.016340] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.016538] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "62954fe5-a462-40bd-85ec-d03b98d2ec42" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.016732] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.016926] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "60b00251-25fc-483d-88fe-a84165d6a435" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.017142] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "6062dd02-230d-42bc-8304-fc122f1f1489" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.017342] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "e64fd474-91ab-449e-8785-e788685ed77a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.017542] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.017734] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "d5fb115d-778d-4fc7-a03a-8f5828868a01" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1187.112574] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1187.280939] env[69648]: DEBUG oslo_concurrency.lockutils [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquiring lock "d5fb115d-778d-4fc7-a03a-8f5828868a01" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.411727] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.411998] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1191.064925] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1192.060508] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1192.065164] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1193.065580] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1194.065053] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1194.065053] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1194.065053] env[69648]: DEBUG nova.compute.manager [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1194.087290] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1194.087892] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1194.088068] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1194.088239] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1194.088380] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1194.088511] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1194.088638] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1194.088766] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1194.088886] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1194.089009] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1194.089151] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. 
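The _heal_instance_info_cache pass above, like the other "Running periodic task ComputeManager._poll_*" entries, is driven by oslo.service's periodic-task machinery. A minimal, hedged sketch of that pattern (illustrative names, not Nova's actual manager):

    from oslo_config import cfg
    from oslo_service import periodic_task

    class ExampleManager(periodic_task.PeriodicTasks):
        """Registers methods that the service loop fires on a schedule."""

        def __init__(self):
            super().__init__(cfg.CONF)

        @periodic_task.periodic_task(spacing=60)
        def _poll_something(self, context):
            # Each decorated method produces a "Running periodic task ..." DEBUG
            # line when run_periodic_tasks() invokes it.
            pass

    # ExampleManager().run_periodic_tasks(context=None) runs all due tasks once.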
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1194.089692] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1194.089894] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1194.090042] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1195.065134] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1195.076698] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1195.076949] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1195.077145] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1195.077306] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1195.078469] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5efed377-bd56-4b0c-88f9-f50b6bdeb0ce {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.087357] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81732ad9-d1c0-412e-8a74-1b51165d8bab {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.101650] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46b3f547-1626-434e-b361-a10106ebe608 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.107972] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4621c534-decb-47b9-9c88-ba47b4390962 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.136915] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180968MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1195.137069] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1195.137267] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1195.204937] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 91fcee48-3466-480d-bf87-bc4de17fbf31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1195.205126] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1195.205261] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1195.205388] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1195.205509] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
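The "Acquiring lock ... / Lock ... acquired ... waited N s / Lock ... released ... held N s" triples, whether the lock name is an instance UUID (as in the _sync_power_states burst earlier) or a shared name such as "compute_resources", come from oslo.concurrency. A minimal sketch of the pattern, assuming plain lockutils rather than Nova's own wrappers:

    from oslo_concurrency import lockutils

    def update_available_resource_example():
        # Context-manager form: serializes every code path that takes the same name.
        with lockutils.lock("compute_resources"):
            pass  # audit and update resources while holding the semaphore

    @lockutils.synchronized("compute_resources")
    def clean_compute_node_cache_example():
        # Decorator form; acquisition, wait time and hold time are logged at DEBUG.
        pass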
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1195.205627] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60b00251-25fc-483d-88fe-a84165d6a435 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1195.205792] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6062dd02-230d-42bc-8304-fc122f1f1489 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1195.205917] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e64fd474-91ab-449e-8785-e788685ed77a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1195.206044] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1195.206160] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d5fb115d-778d-4fc7-a03a-8f5828868a01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1195.217957] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3b09bb16-99c5-457a-aaf6-30f2c4d7dd32 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1195.228812] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ab839f84-b864-409e-883d-00dddb5db3db has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1195.241936] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ea075f2f-4f2d-4b1f-a6cd-e125b6554d24 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1195.252435] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 09d8d722-d63e-4675-bd53-7862c677424d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1195.261409] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 93b15196-95be-471b-ab26-193e23e163ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1195.272251] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 163ce80a-d23b-43ea-8d19-a93fdce9e552 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1195.281384] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 468e1b4a-8701-4413-a836-b8377877ccf1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1195.290118] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1195.299199] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a924bdee-1e16-4d78-ac6b-9574677de55f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1195.309562] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
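The "Final resource view" reported just below (used_ram=1856MB, used_disk=10GB, used_vcpus=10) follows directly from the ten instances above that are actively managed with placement allocations, plus the 512 MB reserved in the MEMORY_MB inventory; the instances that are only scheduled here are skipped. A quick cross-check, assuming used_ram counts that reserved memory:

    # One tracked instance carries MEMORY_MB: 192, the other nine carry 128 each.
    instance_ram_mb = [192] + [128] * 9
    used_ram_mb = sum(instance_ram_mb) + 512   # 1344 + 512 = 1856 MB, as logged
    used_disk_gb = 10 * 1                      # ten allocations of DISK_GB: 1
    used_vcpus = 10 * 1                        # ten allocations of VCPU: 1
    schedulable_vcpus = int(48 * 4.0)          # total_vcpus * allocation_ratio = 192 (nominal)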
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1195.309795] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1195.309941] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1195.544067] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aff6fd13-0dee-43f2-a8e9-d037f8d463c0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.551931] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4d6aa3a-e6dd-4ca6-b417-ef90cccd287f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.582500] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9147742c-df08-41fc-9b54-99ed05184fe4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.588254] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76de9613-920e-4510-bde7-8afcd399f2cf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.600805] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1195.610508] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1195.623576] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1195.623782] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.486s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1201.849455] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "58804be5-ee46-4b25-be84-890d5cd1607f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1201.849755] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "58804be5-ee46-4b25-be84-890d5cd1607f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1201.885322] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "74a74c62-5c24-426e-ae6f-29511de99462" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1201.885322] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "74a74c62-5c24-426e-ae6f-29511de99462" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1203.903143] env[69648]: WARNING oslo_vmware.rw_handles [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1203.903143] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1203.903143] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1203.903143] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1203.903143] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1203.903143] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1203.903143] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1203.903143] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1203.903143] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1203.903143] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1203.903143] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1203.903143] env[69648]: ERROR oslo_vmware.rw_handles [ 1203.903143] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 
tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/c445a913-7f50-4edf-baad-37e02b96e4b0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1203.906943] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1203.907300] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Copying Virtual Disk [datastore1] vmware_temp/c445a913-7f50-4edf-baad-37e02b96e4b0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/c445a913-7f50-4edf-baad-37e02b96e4b0/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1203.908081] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a65786ee-03e3-49fd-af88-a4f2fe2e30e4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.918722] env[69648]: DEBUG oslo_vmware.api [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 1203.918722] env[69648]: value = "task-3466560" [ 1203.918722] env[69648]: _type = "Task" [ 1203.918722] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1203.927259] env[69648]: DEBUG oslo_vmware.api [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': task-3466560, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1204.432114] env[69648]: DEBUG oslo_vmware.exceptions [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Fault InvalidArgument not matched. 
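The WARNING with the http.client.RemoteDisconnected traceback above is raised while closing the datastore write handle: the ESX endpoint drops the connection before answering the final request. The following entry shows the image data had still been written, so the condition is logged and tolerated. A minimal illustration of the underlying standard-library behaviour, independent of oslo.vmware:

    import http.client

    def close_write_handle(conn: http.client.HTTPConnection):
        # Reading the response on close can fail if the server has already
        # dropped the socket; http.client then raises RemoteDisconnected.
        try:
            return conn.getresponse()
        except http.client.RemoteDisconnected:
            return None  # treated as non-fatal here, mirroring the WARNING above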
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1204.432114] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1204.432114] env[69648]: ERROR nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1204.432114] env[69648]: Faults: ['InvalidArgument'] [ 1204.432114] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Traceback (most recent call last): [ 1204.432114] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1204.432114] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] yield resources [ 1204.432114] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1204.432114] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] self.driver.spawn(context, instance, image_meta, [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] self._fetch_image_if_missing(context, vi) [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] image_cache(vi, tmp_image_ds_loc) [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] vm_util.copy_virtual_disk( [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] session._wait_for_task(vmdk_copy_task) [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] return self.wait_for_task(task_ref) [ 1204.432447] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] return evt.wait() [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] result = hub.switch() [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] return self.greenlet.switch() [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] self.f(*self.args, **self.kw) [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] raise exceptions.translate_fault(task_info.error) [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Faults: ['InvalidArgument'] [ 1204.432859] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] [ 1204.433194] env[69648]: INFO nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Terminating instance [ 1204.434714] env[69648]: DEBUG nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1204.434714] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1204.434975] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1204.435213] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1204.435956] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd14e9fd-5e18-4181-890b-3603996089d0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.441601] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-084de18f-4409-4ce8-a8d9-2426849e765d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.445852] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1204.446130] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e40d9de4-c7f9-4d67-9e12-c51a8ddc3e86 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.449095] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1204.449753] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1204.453687] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3cec7476-08bb-47d9-ab60-901859403662 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.458017] env[69648]: DEBUG oslo_vmware.api [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 1204.458017] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]524d0bff-5635-dcf9-21f5-e566b78ce783" [ 1204.458017] env[69648]: _type = "Task" [ 1204.458017] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1204.464642] env[69648]: DEBUG oslo_vmware.api [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]524d0bff-5635-dcf9-21f5-e566b78ce783, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1204.581228] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1204.581228] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1204.581228] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Deleting the datastore file [datastore1] 91fcee48-3466-480d-bf87-bc4de17fbf31 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1204.581228] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-40c4660b-df1b-4556-84b1-c38fbff59b23 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.588018] env[69648]: DEBUG oslo_vmware.api [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 1204.588018] env[69648]: value = "task-3466562" [ 1204.588018] env[69648]: _type = "Task" [ 1204.588018] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1204.595448] env[69648]: DEBUG oslo_vmware.api [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': task-3466562, 'name': DeleteDatastoreFile_Task} progress is 0%. 
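The repeated "Waiting for the task: (returnval){ value = "task-..." } to complete." / "progress is N%" pairs come from oslo.vmware polling a vCenter task until it reaches a terminal state. A simplified stand-in for that loop (not the library's implementation):

    import time

    def wait_for_task(read_task_info, interval=0.5):
        # read_task_info() is assumed to return an object shaped like the vSphere
        # TaskInfo structure, with .state, .progress, .result and .error fields.
        while True:
            info = read_task_info()
            if info.state == "success":
                return info.result
            if info.state == "error":
                raise RuntimeError(info.error)  # oslo.vmware maps this to a fault class
            time.sleep(interval)                # each poll logs "progress is N%"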
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1204.966641] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1204.966938] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Creating directory with path [datastore1] vmware_temp/8b1e245e-a918-48bc-b253-e61c9f4bed15/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1204.967150] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-98f5dd7a-7598-40a7-942d-567a300c5cf4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.980508] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Created directory with path [datastore1] vmware_temp/8b1e245e-a918-48bc-b253-e61c9f4bed15/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1204.980759] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Fetch image to [datastore1] vmware_temp/8b1e245e-a918-48bc-b253-e61c9f4bed15/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1204.980921] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/8b1e245e-a918-48bc-b253-e61c9f4bed15/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1204.981840] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d123b7b-e05d-4adf-ad01-f96e865a8418 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.988804] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-782396ac-7020-4bb7-ba72-0af5631aa43c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.998194] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d70b20a4-cdc0-41af-af08-d053634634c9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1205.035321] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-43dcc2ad-6bf1-485e-81ca-cc08008fadd6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1205.040668] env[69648]: DEBUG oslo_concurrency.lockutils [None req-8972f900-08bd-4550-9208-40a1747a698a tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "60cb1a27-ecc3-43a6-8efa-b54fd2f400ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1205.041108] env[69648]: DEBUG oslo_concurrency.lockutils [None req-8972f900-08bd-4550-9208-40a1747a698a tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "60cb1a27-ecc3-43a6-8efa-b54fd2f400ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1205.044945] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0fd1956f-513a-4b1d-a456-22bd8193fd3e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1205.064498] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1205.097820] env[69648]: DEBUG oslo_vmware.api [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': task-3466562, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079688} completed successfully. 
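Entries such as "Preparing fetch location", "Creating directory with path [datastore1] vmware_temp/...", "Fetch image to ... tmp-sparse.vmdk" and the earlier "Copying Virtual Disk ..." trace the per-image cache fill on the datastore. A hedged outline of that flow, using placeholder callables rather than Nova's API:

    def fetch_image_if_missing(cached_vmdk_exists, make_temp_dir, download_to_temp, cache_image):
        # If the image is not yet under devstack-image-cache_base, download it to a
        # vmware_temp/... path and then copy it into the cache for later builds.
        if cached_vmdk_exists():
            return
        make_temp_dir()           # "Creating directory with path [datastore1] vmware_temp/..."
        tmp = download_to_temp()  # "Downloading image file data ... to tmp-sparse.vmdk"
        cache_image(tmp)          # "Caching image" / "Copying Virtual Disk ..."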
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1205.098285] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1205.099137] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1205.100068] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1205.100068] env[69648]: INFO nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Took 0.67 seconds to destroy the instance on the hypervisor. [ 1205.102709] env[69648]: DEBUG nova.compute.claims [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1205.103073] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1205.103591] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1205.227770] env[69648]: DEBUG oslo_vmware.rw_handles [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8b1e245e-a918-48bc-b253-e61c9f4bed15/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1205.291761] env[69648]: DEBUG oslo_vmware.rw_handles [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Completed reading data from the image iterator. 
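The write handle above streams the 21,318,656-byte image to a datastore folder URL over HTTPS. A hedged sketch of such a chunked upload; the real code is oslo.vmware's rw_handles, which also handles the vCenter service ticket (see SessionManager.AcquireGenericServiceTicket above), omitted here:

    import http.client
    from urllib.parse import urlsplit

    def upload_to_datastore(url, data_iter, size, cookie):
        # url looks like https://<esx-host>/folder/vmware_temp/...?dcPath=...&dsName=datastore1
        parts = urlsplit(url)
        conn = http.client.HTTPSConnection(parts.netloc)
        conn.putrequest("PUT", f"{parts.path}?{parts.query}")
        conn.putheader("Content-Length", str(size))
        conn.putheader("Cookie", cookie)   # auth detail assumed, not taken from the log
        conn.endheaders()
        for chunk in data_iter:            # the "image iterator" the log refers to
            conn.send(chunk)
        return conn.getresponse()          # may raise RemoteDisconnected, as seen earlier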
{{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1205.292356] env[69648]: DEBUG oslo_vmware.rw_handles [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8b1e245e-a918-48bc-b253-e61c9f4bed15/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1205.590550] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94667899-c535-4a5b-ba43-9a43b7402381 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1205.598357] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5feb62dd-527b-4ac3-acd5-77ec58e6918d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1205.642538] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cafd4020-44a0-4383-baa5-609048a95e03 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1205.651455] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a1f1de4-4791-492f-9be0-77a7147ca2c0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1205.667626] env[69648]: DEBUG nova.compute.provider_tree [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1205.677836] env[69648]: DEBUG nova.scheduler.client.report [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1205.701236] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.597s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1205.701808] env[69648]: ERROR nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 
91fcee48-3466-480d-bf87-bc4de17fbf31] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1205.701808] env[69648]: Faults: ['InvalidArgument'] [ 1205.701808] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Traceback (most recent call last): [ 1205.701808] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1205.701808] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] self.driver.spawn(context, instance, image_meta, [ 1205.701808] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1205.701808] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1205.701808] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1205.701808] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] self._fetch_image_if_missing(context, vi) [ 1205.701808] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1205.701808] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] image_cache(vi, tmp_image_ds_loc) [ 1205.701808] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] vm_util.copy_virtual_disk( [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] session._wait_for_task(vmdk_copy_task) [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] return self.wait_for_task(task_ref) [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] return evt.wait() [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] result = hub.switch() [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1205.702220] env[69648]: ERROR 
nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] return self.greenlet.switch() [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1205.702220] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] self.f(*self.args, **self.kw) [ 1205.702613] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1205.702613] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] raise exceptions.translate_fault(task_info.error) [ 1205.702613] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1205.702613] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Faults: ['InvalidArgument'] [ 1205.702613] env[69648]: ERROR nova.compute.manager [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] [ 1205.702613] env[69648]: DEBUG nova.compute.utils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1205.704801] env[69648]: DEBUG nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Build of instance 91fcee48-3466-480d-bf87-bc4de17fbf31 was re-scheduled: A specified parameter was not correct: fileType [ 1205.704801] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1205.705221] env[69648]: DEBUG nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1205.705781] env[69648]: DEBUG nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
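After the spawn failure above, the compute manager unwinds the build: the resource claim is aborted under the "compute_resources" lock, networking for the instance is deallocated, and the build is re-scheduled, as the surrounding entries show. A hedged outline of that unwind with placeholder callables:

    def unwind_failed_build(abort_claim, deallocate_network, reschedule):
        abort_claim()          # "Aborting claim" / ResourceTracker.abort_instance_claim
        deallocate_network()   # "Deallocating network for instance" / deallocate_for_instance()
        reschedule()           # "Build of instance ... was re-scheduled"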
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1205.705781] env[69648]: DEBUG nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1205.705919] env[69648]: DEBUG nova.network.neutron [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1206.263196] env[69648]: DEBUG nova.network.neutron [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1206.280088] env[69648]: INFO nova.compute.manager [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Took 0.57 seconds to deallocate network for instance. [ 1206.398944] env[69648]: INFO nova.scheduler.client.report [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Deleted allocations for instance 91fcee48-3466-480d-bf87-bc4de17fbf31 [ 1206.451472] env[69648]: DEBUG oslo_concurrency.lockutils [None req-965bfc8e-8fea-46ac-990d-1ada018adf8b tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "91fcee48-3466-480d-bf87-bc4de17fbf31" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 604.371s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.456237] env[69648]: DEBUG oslo_concurrency.lockutils [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "91fcee48-3466-480d-bf87-bc4de17fbf31" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 207.362s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1206.456237] env[69648]: DEBUG oslo_concurrency.lockutils [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "91fcee48-3466-480d-bf87-bc4de17fbf31-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1206.456237] env[69648]: DEBUG oslo_concurrency.lockutils [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "91fcee48-3466-480d-bf87-bc4de17fbf31-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1206.456747] env[69648]: DEBUG oslo_concurrency.lockutils [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "91fcee48-3466-480d-bf87-bc4de17fbf31-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.458088] env[69648]: INFO nova.compute.manager [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Terminating instance [ 1206.463021] env[69648]: DEBUG nova.compute.manager [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1206.466020] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1206.466020] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ad6a71d3-ebff-4fcb-bedb-4fcec12880e2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.477045] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-656347c8-c523-4cf1-8210-c64789e4620a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.491765] env[69648]: DEBUG nova.compute.manager [None req-197cc09c-0aa3-40e4-8836-648405be5631 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: d19d0e28-8e92-4188-b570-0488fe81ba66] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1206.518623] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 91fcee48-3466-480d-bf87-bc4de17fbf31 could not be found. 
[ 1206.518837] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1206.519026] env[69648]: INFO nova.compute.manager [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Took 0.06 seconds to destroy the instance on the hypervisor. [ 1206.519311] env[69648]: DEBUG oslo.service.loopingcall [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1206.519555] env[69648]: DEBUG nova.compute.manager [-] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1206.519653] env[69648]: DEBUG nova.network.neutron [-] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1206.535055] env[69648]: DEBUG nova.compute.manager [None req-197cc09c-0aa3-40e4-8836-648405be5631 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: d19d0e28-8e92-4188-b570-0488fe81ba66] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1206.548744] env[69648]: DEBUG nova.network.neutron [-] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1206.556786] env[69648]: INFO nova.compute.manager [-] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] Took 0.04 seconds to deallocate network for instance. [ 1206.563030] env[69648]: DEBUG oslo_concurrency.lockutils [None req-197cc09c-0aa3-40e4-8836-648405be5631 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "d19d0e28-8e92-4188-b570-0488fe81ba66" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.097s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.572898] env[69648]: DEBUG nova.compute.manager [None req-881f9027-949f-4ffe-a349-fe487d85920b tempest-ServersTestFqdnHostnames-1467993018 tempest-ServersTestFqdnHostnames-1467993018-project-member] [instance: 3b09bb16-99c5-457a-aaf6-30f2c4d7dd32] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1206.615522] env[69648]: DEBUG nova.compute.manager [None req-881f9027-949f-4ffe-a349-fe487d85920b tempest-ServersTestFqdnHostnames-1467993018 tempest-ServersTestFqdnHostnames-1467993018-project-member] [instance: 3b09bb16-99c5-457a-aaf6-30f2c4d7dd32] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1206.644074] env[69648]: DEBUG oslo_concurrency.lockutils [None req-881f9027-949f-4ffe-a349-fe487d85920b tempest-ServersTestFqdnHostnames-1467993018 tempest-ServersTestFqdnHostnames-1467993018-project-member] Lock "3b09bb16-99c5-457a-aaf6-30f2c4d7dd32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.037s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.667142] env[69648]: DEBUG nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1206.687502] env[69648]: DEBUG oslo_concurrency.lockutils [None req-6c9d0fc1-e039-48ca-85e1-b6d2d8a3958a tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "91fcee48-3466-480d-bf87-bc4de17fbf31" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.235s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.688257] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "91fcee48-3466-480d-bf87-bc4de17fbf31" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 32.672s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1206.688447] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 91fcee48-3466-480d-bf87-bc4de17fbf31] During sync_power_state the instance has a pending task (deleting). Skip. 
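The lockutils lines above also record how long each lock was waited on and held; for example, the instance lock for 91fcee48-3466-480d-bf87-bc4de17fbf31 was held for 604.371s by _locked_do_build_and_run_instance while do_terminate_instance waited 207.362s for it. A quick way to surface such long-held locks is to aggregate those durations per lock name. The sketch below is illustrative only and assumes the acquire/release line format shown in this log:

    # Illustrative sketch: summarize worst-case wait/hold times per lock name
    # from oslo_concurrency.lockutils debug lines like the ones above.
    import re
    from collections import defaultdict

    ACQUIRED = re.compile(
        r'Lock "(?P<name>[^"]+)" acquired by "(?P<owner>[^"]+)" :: waited (?P<secs>[0-9.]+)s')
    RELEASED = re.compile(
        r'Lock "(?P<name>[^"]+)" "released" by "(?P<owner>[^"]+)" :: held (?P<secs>[0-9.]+)s')

    def summarize(path="nova-compute.log", top=10):  # path is an assumption
        waited = defaultdict(float)   # lock name -> longest wait seen
        held = defaultdict(float)     # lock name -> longest hold seen
        with open(path, errors="replace") as fh:
            for line in fh:
                if (m := ACQUIRED.search(line)):
                    waited[m["name"]] = max(waited[m["name"]], float(m["secs"]))
                if (m := RELEASED.search(line)):
                    held[m["name"]] = max(held[m["name"]], float(m["secs"]))
        for name, secs in sorted(held.items(), key=lambda kv: kv[1], reverse=True)[:top]:
            print(f"{name}: held up to {secs:.1f}s (waited up to {waited.get(name, 0.0):.1f}s)")

    if __name__ == "__main__":
        summarize()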
[ 1206.688625] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "91fcee48-3466-480d-bf87-bc4de17fbf31" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1206.718302] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1206.718550] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1206.720058] env[69648]: INFO nova.compute.claims [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1207.140319] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2ad55b8-b491-426b-9253-5fc034a24e29 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.156032] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06a5d6b2-eada-4bef-a53b-56c96158c8f5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.200736] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f61052be-f870-4663-ad8c-0a16b2dc7b9a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.208790] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42b05c81-face-4c72-b0e4-d00b73dc76c9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.223048] env[69648]: DEBUG nova.compute.provider_tree [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1207.235898] env[69648]: DEBUG nova.scheduler.client.report [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1207.250658] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.532s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.251168] env[69648]: DEBUG nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1207.287641] env[69648]: DEBUG nova.compute.utils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1207.289127] env[69648]: DEBUG nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1207.289127] env[69648]: DEBUG nova.network.neutron [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1207.302240] env[69648]: DEBUG nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1207.382205] env[69648]: DEBUG nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1207.388016] env[69648]: DEBUG nova.policy [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'de7a9b70cfd04f6ca0515f7ffa469085', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a5f578bc0e014b3aab62a1c7d737881b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1207.414334] env[69648]: DEBUG nova.virt.hardware [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1207.414607] env[69648]: DEBUG nova.virt.hardware [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1207.414767] env[69648]: DEBUG nova.virt.hardware [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1207.414964] env[69648]: DEBUG nova.virt.hardware [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1207.415304] env[69648]: DEBUG nova.virt.hardware [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1207.415491] env[69648]: DEBUG nova.virt.hardware [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 
1207.415717] env[69648]: DEBUG nova.virt.hardware [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1207.415945] env[69648]: DEBUG nova.virt.hardware [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1207.416123] env[69648]: DEBUG nova.virt.hardware [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1207.416242] env[69648]: DEBUG nova.virt.hardware [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1207.416402] env[69648]: DEBUG nova.virt.hardware [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1207.417300] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07a19cc5-f37f-4241-ab5e-d47d60f7f7a2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.426053] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5765e2c-2250-4197-863d-36e1a590f880 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.809856] env[69648]: DEBUG nova.network.neutron [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Successfully created port: cbac1472-c802-4b61-8851-cd7e4c57ecfb {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1208.714836] env[69648]: DEBUG nova.network.neutron [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Successfully updated port: cbac1472-c802-4b61-8851-cd7e4c57ecfb {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1208.728922] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquiring lock "refresh_cache-ab839f84-b864-409e-883d-00dddb5db3db" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1208.729201] env[69648]: DEBUG 
oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquired lock "refresh_cache-ab839f84-b864-409e-883d-00dddb5db3db" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1208.729251] env[69648]: DEBUG nova.network.neutron [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1208.778128] env[69648]: DEBUG nova.network.neutron [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1208.814235] env[69648]: DEBUG nova.compute.manager [req-ed12e265-9fbf-4076-a031-4664702dcca2 req-f969048f-4f8f-47eb-b3b2-fdbc14cec05b service nova] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Received event network-vif-plugged-cbac1472-c802-4b61-8851-cd7e4c57ecfb {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1208.814472] env[69648]: DEBUG oslo_concurrency.lockutils [req-ed12e265-9fbf-4076-a031-4664702dcca2 req-f969048f-4f8f-47eb-b3b2-fdbc14cec05b service nova] Acquiring lock "ab839f84-b864-409e-883d-00dddb5db3db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1208.814682] env[69648]: DEBUG oslo_concurrency.lockutils [req-ed12e265-9fbf-4076-a031-4664702dcca2 req-f969048f-4f8f-47eb-b3b2-fdbc14cec05b service nova] Lock "ab839f84-b864-409e-883d-00dddb5db3db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1208.814899] env[69648]: DEBUG oslo_concurrency.lockutils [req-ed12e265-9fbf-4076-a031-4664702dcca2 req-f969048f-4f8f-47eb-b3b2-fdbc14cec05b service nova] Lock "ab839f84-b864-409e-883d-00dddb5db3db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1208.815032] env[69648]: DEBUG nova.compute.manager [req-ed12e265-9fbf-4076-a031-4664702dcca2 req-f969048f-4f8f-47eb-b3b2-fdbc14cec05b service nova] [instance: ab839f84-b864-409e-883d-00dddb5db3db] No waiting events found dispatching network-vif-plugged-cbac1472-c802-4b61-8851-cd7e4c57ecfb {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1208.815203] env[69648]: WARNING nova.compute.manager [req-ed12e265-9fbf-4076-a031-4664702dcca2 req-f969048f-4f8f-47eb-b3b2-fdbc14cec05b service nova] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Received unexpected event network-vif-plugged-cbac1472-c802-4b61-8851-cd7e4c57ecfb for instance with vm_state building and task_state spawning. 
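The inventory reported above for provider d38a352b-7808-44da-8216-792e96aadc88 (VCPU total 48 at allocation_ratio 4.0, MEMORY_MB total 196590 with 512 reserved, DISK_GB total 400) explains why the m1.nano claim for ab839f84-b864-409e-883d-00dddb5db3db succeeds: under the usual placement convention, schedulable capacity per resource class is (total - reserved) * allocation_ratio. The sketch below only reproduces that arithmetic with the values from the log; it is illustrative, not a reimplementation of the resource tracker:

    # Illustrative arithmetic: schedulable capacity implied by the inventory
    # logged above, versus the m1.nano request (1 vCPU, 128 MB RAM, 1 GB root disk).
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
    }
    request = {"VCPU": 1, "MEMORY_MB": 128, "DISK_GB": 1}

    for rc, inv in inventory.items():
        # capacity = (total - reserved) * allocation_ratio
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: capacity {capacity:.0f}, requested {request[rc]}")
    # Prints VCPU capacity 192, MEMORY_MB capacity 196078, DISK_GB capacity 400,
    # so the claim fits comfortably, matching the "Claim successful" line above.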
[ 1209.043276] env[69648]: DEBUG nova.network.neutron [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Updating instance_info_cache with network_info: [{"id": "cbac1472-c802-4b61-8851-cd7e4c57ecfb", "address": "fa:16:3e:e4:96:d6", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcbac1472-c8", "ovs_interfaceid": "cbac1472-c802-4b61-8851-cd7e4c57ecfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1209.057869] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Releasing lock "refresh_cache-ab839f84-b864-409e-883d-00dddb5db3db" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1209.058589] env[69648]: DEBUG nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Instance network_info: |[{"id": "cbac1472-c802-4b61-8851-cd7e4c57ecfb", "address": "fa:16:3e:e4:96:d6", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcbac1472-c8", "ovs_interfaceid": "cbac1472-c802-4b61-8851-cd7e4c57ecfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1209.058727] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 
tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e4:96:d6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92233552-2c0c-416e-9bf3-bfcca8eda2dc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cbac1472-c802-4b61-8851-cd7e4c57ecfb', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1209.066934] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Creating folder: Project (a5f578bc0e014b3aab62a1c7d737881b). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1209.067585] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-01d1c510-a834-472f-91ed-4aa0602ab211 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.079233] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Created folder: Project (a5f578bc0e014b3aab62a1c7d737881b) in parent group-v692308. [ 1209.079445] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Creating folder: Instances. Parent ref: group-v692373. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1209.079693] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eb44d985-a9a9-4ce6-9a8f-7c13ac5b771d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.088364] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Created folder: Instances in parent group-v692373. [ 1209.088612] env[69648]: DEBUG oslo.service.loopingcall [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1209.088800] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1209.089012] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8457b854-88fc-4031-8f61-27ebb322ae12 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.109656] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1209.109656] env[69648]: value = "task-3466565" [ 1209.109656] env[69648]: _type = "Task" [ 1209.109656] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1209.117387] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466565, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1209.126137] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquiring lock "c97308be-406b-4fd0-b502-69e8c800773f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1209.126365] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Lock "c97308be-406b-4fd0-b502-69e8c800773f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1209.621211] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466565, 'name': CreateVM_Task, 'duration_secs': 0.340506} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1209.621418] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1209.628298] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1209.628472] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1209.628797] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1209.629057] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2567e296-3852-46f1-af5e-66bb79b4bca0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.633862] env[69648]: DEBUG oslo_vmware.api [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Waiting for the task: (returnval){ [ 1209.633862] env[69648]: value = 
"session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52ae303e-e22f-2c1a-a7c4-7b78b57abf83" [ 1209.633862] env[69648]: _type = "Task" [ 1209.633862] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1209.642181] env[69648]: DEBUG oslo_vmware.api [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52ae303e-e22f-2c1a-a7c4-7b78b57abf83, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1210.143816] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1210.144114] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1210.144336] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1210.843282] env[69648]: DEBUG nova.compute.manager [req-6cd13e24-0eed-442e-b3d2-f31a9161f229 req-63f41269-280f-4c9d-a4cc-09de671b1324 service nova] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Received event network-changed-cbac1472-c802-4b61-8851-cd7e4c57ecfb {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1210.843282] env[69648]: DEBUG nova.compute.manager [req-6cd13e24-0eed-442e-b3d2-f31a9161f229 req-63f41269-280f-4c9d-a4cc-09de671b1324 service nova] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Refreshing instance network info cache due to event network-changed-cbac1472-c802-4b61-8851-cd7e4c57ecfb. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1210.843282] env[69648]: DEBUG oslo_concurrency.lockutils [req-6cd13e24-0eed-442e-b3d2-f31a9161f229 req-63f41269-280f-4c9d-a4cc-09de671b1324 service nova] Acquiring lock "refresh_cache-ab839f84-b864-409e-883d-00dddb5db3db" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1210.843282] env[69648]: DEBUG oslo_concurrency.lockutils [req-6cd13e24-0eed-442e-b3d2-f31a9161f229 req-63f41269-280f-4c9d-a4cc-09de671b1324 service nova] Acquired lock "refresh_cache-ab839f84-b864-409e-883d-00dddb5db3db" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1210.843449] env[69648]: DEBUG nova.network.neutron [req-6cd13e24-0eed-442e-b3d2-f31a9161f229 req-63f41269-280f-4c9d-a4cc-09de671b1324 service nova] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Refreshing network info cache for port cbac1472-c802-4b61-8851-cd7e4c57ecfb {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1211.113907] env[69648]: DEBUG nova.network.neutron [req-6cd13e24-0eed-442e-b3d2-f31a9161f229 req-63f41269-280f-4c9d-a4cc-09de671b1324 service nova] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Updated VIF entry in instance network info cache for port cbac1472-c802-4b61-8851-cd7e4c57ecfb. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1211.114292] env[69648]: DEBUG nova.network.neutron [req-6cd13e24-0eed-442e-b3d2-f31a9161f229 req-63f41269-280f-4c9d-a4cc-09de671b1324 service nova] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Updating instance_info_cache with network_info: [{"id": "cbac1472-c802-4b61-8851-cd7e4c57ecfb", "address": "fa:16:3e:e4:96:d6", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcbac1472-c8", "ovs_interfaceid": "cbac1472-c802-4b61-8851-cd7e4c57ecfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1211.125791] env[69648]: DEBUG oslo_concurrency.lockutils [req-6cd13e24-0eed-442e-b3d2-f31a9161f229 req-63f41269-280f-4c9d-a4cc-09de671b1324 service nova] Releasing lock "refresh_cache-ab839f84-b864-409e-883d-00dddb5db3db" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1214.147624] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4c083434-c107-4bfd-bcc5-68e9c3b0983f tempest-ServerShowV254Test-458023032 tempest-ServerShowV254Test-458023032-project-member] Acquiring lock "8426cc36-f026-46d9-844e-432343410efe" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1214.147970] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4c083434-c107-4bfd-bcc5-68e9c3b0983f tempest-ServerShowV254Test-458023032 tempest-ServerShowV254Test-458023032-project-member] Lock "8426cc36-f026-46d9-844e-432343410efe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1221.233772] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquiring lock "ab839f84-b864-409e-883d-00dddb5db3db" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1226.170497] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e6c10fe-2829-47ed-91bc-db2424e0e8cd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "27bd5158-0b5f-408d-b91c-e7f3fbde894e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1226.170800] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e6c10fe-2829-47ed-91bc-db2424e0e8cd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "27bd5158-0b5f-408d-b91c-e7f3fbde894e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1242.198833] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1e59b601-1e4b-4be2-a779-3a16609aa839 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquiring lock "0af9540b-b092-4396-8573-cdadc66abe02" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1242.199140] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1e59b601-1e4b-4be2-a779-3a16609aa839 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "0af9540b-b092-4396-8573-cdadc66abe02" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1243.787461] env[69648]: DEBUG oslo_concurrency.lockutils [None req-21e577e4-a8c6-41b2-9138-20ab717439fe tempest-ServerRescueTestJSON-1037422394 tempest-ServerRescueTestJSON-1037422394-project-member] Acquiring lock "0e63df98-0345-4d75-b128-291048849e40" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1243.787779] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-21e577e4-a8c6-41b2-9138-20ab717439fe tempest-ServerRescueTestJSON-1037422394 tempest-ServerRescueTestJSON-1037422394-project-member] Lock "0e63df98-0345-4d75-b128-291048849e40" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1247.169887] env[69648]: DEBUG oslo_concurrency.lockutils [None req-afa03ea8-6f97-41f2-bc03-199d24794c07 tempest-ServersV294TestFqdnHostnames-720767939 tempest-ServersV294TestFqdnHostnames-720767939-project-member] Acquiring lock "b7cd17f7-89f5-4f85-8964-532467432b59" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.170198] env[69648]: DEBUG oslo_concurrency.lockutils [None req-afa03ea8-6f97-41f2-bc03-199d24794c07 tempest-ServersV294TestFqdnHostnames-720767939 tempest-ServersV294TestFqdnHostnames-720767939-project-member] Lock "b7cd17f7-89f5-4f85-8964-532467432b59" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1248.624098] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1251.061328] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1251.085318] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1251.630818] env[69648]: WARNING oslo_vmware.rw_handles [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1251.630818] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1251.630818] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1251.630818] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1251.630818] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1251.630818] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1251.630818] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1251.630818] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1251.630818] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1251.630818] env[69648]: ERROR oslo_vmware.rw_handles raise 
RemoteDisconnected("Remote end closed connection without" [ 1251.630818] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1251.630818] env[69648]: ERROR oslo_vmware.rw_handles [ 1251.631397] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/8b1e245e-a918-48bc-b253-e61c9f4bed15/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1251.633315] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1251.633581] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Copying Virtual Disk [datastore1] vmware_temp/8b1e245e-a918-48bc-b253-e61c9f4bed15/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/8b1e245e-a918-48bc-b253-e61c9f4bed15/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1251.633858] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-45ff673e-522e-49fd-9e25-117a1b7203b5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1251.641600] env[69648]: DEBUG oslo_vmware.api [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 1251.641600] env[69648]: value = "task-3466566" [ 1251.641600] env[69648]: _type = "Task" [ 1251.641600] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1251.649261] env[69648]: DEBUG oslo_vmware.api [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': task-3466566, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1252.152129] env[69648]: DEBUG oslo_vmware.exceptions [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1252.152444] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1252.152951] env[69648]: ERROR nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1252.152951] env[69648]: Faults: ['InvalidArgument'] [ 1252.152951] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Traceback (most recent call last): [ 1252.152951] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1252.152951] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] yield resources [ 1252.152951] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1252.152951] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] self.driver.spawn(context, instance, image_meta, [ 1252.152951] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1252.152951] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1252.152951] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1252.152951] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] self._fetch_image_if_missing(context, vi) [ 1252.152951] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] image_cache(vi, tmp_image_ds_loc) [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] vm_util.copy_virtual_disk( [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] session._wait_for_task(vmdk_copy_task) [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] return self.wait_for_task(task_ref) [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] return evt.wait() [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] result = hub.switch() [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1252.153419] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] return self.greenlet.switch() [ 1252.153865] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1252.153865] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] self.f(*self.args, **self.kw) [ 1252.153865] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1252.153865] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] raise exceptions.translate_fault(task_info.error) [ 1252.153865] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1252.153865] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Faults: ['InvalidArgument'] [ 1252.153865] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] [ 1252.153865] env[69648]: INFO nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Terminating instance [ 1252.154867] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1252.155943] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1252.156588] env[69648]: DEBUG nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 
tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1252.156782] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1252.157011] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3f4034d6-6549-4fd8-8c48-0c4b0c79cbd7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.159429] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d84a4bf-659a-4bba-8c4b-d0b8dd3dcd16 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.166201] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1252.166372] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dabcee6a-059c-4f7b-95b7-dbacb9348a87 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.168620] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1252.168789] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1252.169782] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a2725d9e-10bc-4c36-949d-40ad2bfdf053 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.174428] env[69648]: DEBUG oslo_vmware.api [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Waiting for the task: (returnval){ [ 1252.174428] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]522812e0-3514-f9ac-5253-a4596c5caff7" [ 1252.174428] env[69648]: _type = "Task" [ 1252.174428] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1252.181965] env[69648]: DEBUG oslo_vmware.api [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]522812e0-3514-f9ac-5253-a4596c5caff7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1252.233030] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1252.233320] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1252.233505] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Deleting the datastore file [datastore1] 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1252.233762] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-32fd3da9-5c15-41a3-b1a6-b5cf307ba659 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.240952] env[69648]: DEBUG oslo_vmware.api [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for the task: (returnval){ [ 1252.240952] env[69648]: value = "task-3466568" [ 1252.240952] env[69648]: _type = "Task" [ 1252.240952] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1252.249906] env[69648]: DEBUG oslo_vmware.api [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': task-3466568, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1252.685230] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1252.685496] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Creating directory with path [datastore1] vmware_temp/53ce4c80-335b-4ea1-8097-36f2d5d6efa6/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1252.685754] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-182c95be-edde-4317-9469-e0d77a1de58b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.698054] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Created directory with path [datastore1] vmware_temp/53ce4c80-335b-4ea1-8097-36f2d5d6efa6/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1252.698661] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Fetch image to [datastore1] vmware_temp/53ce4c80-335b-4ea1-8097-36f2d5d6efa6/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1252.698661] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/53ce4c80-335b-4ea1-8097-36f2d5d6efa6/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1252.699131] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ab99dc0-e116-4972-9755-af0f07410335 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.706023] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aea0a13a-0975-4752-8fa6-18f453c005fe {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.715304] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e376701-9df7-48b5-bd5a-1b9f3985ce14 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.747423] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c269665-fd9d-474f-8cc4-794716d487ad 
{{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.755172] env[69648]: DEBUG oslo_vmware.api [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Task: {'id': task-3466568, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.087087} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1252.755639] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1252.755818] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1252.755989] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1252.756179] env[69648]: INFO nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Took 0.60 seconds to destroy the instance on the hypervisor. 
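The traceback logged above for task-3466566 (CopyVirtualDisk_Task) shows the failure path: the looping call in _poll_task finds the task in error, exceptions.translate_fault(task_info.error) raises a VimFaultException carrying Faults: ['InvalidArgument'] for the bad fileType, and wait_for_task propagates it up to _build_and_run_instance, which then unregisters the VM and deletes its datastore files. Below is a minimal sketch of that poll-and-translate pattern only; fetch_task_info and its attributes are illustrative assumptions for this sketch, not the actual oslo.vmware API.

```python
import time


class VimFaultException(Exception):
    """Carries the fault names reported by vCenter, e.g. ['InvalidArgument']."""

    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


def wait_for_task(fetch_task_info, poll_interval=0.5):
    """Poll a vCenter-style task until success, raising on error.

    fetch_task_info is assumed (for this sketch only) to return an object
    with .state, .fault_names and .error_message attributes.
    """
    while True:
        info = fetch_task_info()
        if info.state == "success":
            return info
        if info.state == "error":
            # Mirrors the traceback above: the task error is translated into
            # an exception that bubbles up through wait_for_task, which is
            # what surfaces "A specified parameter was not correct: fileType"
            # with Faults: ['InvalidArgument'] in the compute manager.
            raise VimFaultException(info.fault_names, info.error_message)
        time.sleep(poll_interval)
```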
[ 1252.757921] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a6277565-189d-409b-a766-bacbf5590b11 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.759738] env[69648]: DEBUG nova.compute.claims [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1252.759911] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1252.760168] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1252.784387] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1252.936397] env[69648]: DEBUG oslo_vmware.rw_handles [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/53ce4c80-335b-4ea1-8097-36f2d5d6efa6/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1252.995831] env[69648]: DEBUG oslo_vmware.rw_handles [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1252.996033] env[69648]: DEBUG oslo_vmware.rw_handles [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/53ce4c80-335b-4ea1-8097-36f2d5d6efa6/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1253.093643] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b11851ce-19bb-4e3b-9d89-cbb5674e5447 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.102497] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e991e4f5-6307-4b43-a1ab-0f5403e61fe4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.131805] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-064ec47e-0d6c-4d4f-8d55-2b645205d215 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.138785] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f90c6390-f282-464a-990e-34e2e5f3cf4a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.151719] env[69648]: DEBUG nova.compute.provider_tree [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1253.160432] env[69648]: DEBUG nova.scheduler.client.report [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1253.173863] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.414s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1253.174417] env[69648]: ERROR nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1253.174417] env[69648]: Faults: ['InvalidArgument'] [ 1253.174417] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Traceback (most recent call last): [ 1253.174417] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 
1253.174417] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] self.driver.spawn(context, instance, image_meta, [ 1253.174417] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1253.174417] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1253.174417] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1253.174417] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] self._fetch_image_if_missing(context, vi) [ 1253.174417] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1253.174417] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] image_cache(vi, tmp_image_ds_loc) [ 1253.174417] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] vm_util.copy_virtual_disk( [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] session._wait_for_task(vmdk_copy_task) [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] return self.wait_for_task(task_ref) [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] return evt.wait() [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] result = hub.switch() [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] return self.greenlet.switch() [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1253.174862] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] self.f(*self.args, **self.kw) [ 1253.175260] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] 
File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1253.175260] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] raise exceptions.translate_fault(task_info.error) [ 1253.175260] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1253.175260] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Faults: ['InvalidArgument'] [ 1253.175260] env[69648]: ERROR nova.compute.manager [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] [ 1253.175260] env[69648]: DEBUG nova.compute.utils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1253.176524] env[69648]: DEBUG nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Build of instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 was re-scheduled: A specified parameter was not correct: fileType [ 1253.176524] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1253.176974] env[69648]: DEBUG nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1253.177263] env[69648]: DEBUG nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1253.177547] env[69648]: DEBUG nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1253.177800] env[69648]: DEBUG nova.network.neutron [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1253.786405] env[69648]: DEBUG nova.network.neutron [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1253.797168] env[69648]: INFO nova.compute.manager [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Took 0.62 seconds to deallocate network for instance. [ 1253.894347] env[69648]: INFO nova.scheduler.client.report [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Deleted allocations for instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 [ 1253.914074] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f009881c-79e9-42fd-ba9f-cd2500bed6d6 tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 649.236s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1253.915367] env[69648]: DEBUG oslo_concurrency.lockutils [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 451.030s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1253.915628] env[69648]: DEBUG oslo_concurrency.lockutils [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Acquiring lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1253.915852] env[69648]: DEBUG oslo_concurrency.lockutils [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1253.916051] env[69648]: DEBUG oslo_concurrency.lockutils [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1253.918762] env[69648]: INFO nova.compute.manager [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Terminating instance [ 1253.920601] env[69648]: DEBUG nova.compute.manager [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1253.920892] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1253.921371] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3dbd40fe-a1f4-41e7-8cea-c2eb978b6ac5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.930762] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0588aa4a-91ba-4b33-be59-c602c927a217 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.943327] env[69648]: DEBUG nova.compute.manager [None req-376ef2f5-d682-435b-aaa5-e8afa8c1c078 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: ea075f2f-4f2d-4b1f-a6cd-e125b6554d24] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1253.965072] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 45ccc6ec-6501-4477-9b94-1c0e3d1271d9 could not be found. 
[ 1253.965072] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1253.965072] env[69648]: INFO nova.compute.manager [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1253.965292] env[69648]: DEBUG oslo.service.loopingcall [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1253.965466] env[69648]: DEBUG nova.compute.manager [-] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1253.965566] env[69648]: DEBUG nova.network.neutron [-] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1253.971971] env[69648]: DEBUG nova.compute.manager [None req-376ef2f5-d682-435b-aaa5-e8afa8c1c078 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: ea075f2f-4f2d-4b1f-a6cd-e125b6554d24] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1253.990813] env[69648]: DEBUG nova.network.neutron [-] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1253.994960] env[69648]: DEBUG oslo_concurrency.lockutils [None req-376ef2f5-d682-435b-aaa5-e8afa8c1c078 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "ea075f2f-4f2d-4b1f-a6cd-e125b6554d24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.448s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1253.999817] env[69648]: INFO nova.compute.manager [-] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] Took 0.03 seconds to deallocate network for instance. [ 1254.004931] env[69648]: DEBUG nova.compute.manager [None req-a13e343a-bdb6-47df-97e0-430a30cb050e tempest-ServerTagsTestJSON-1019271742 tempest-ServerTagsTestJSON-1019271742-project-member] [instance: 09d8d722-d63e-4675-bd53-7862c677424d] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1254.029283] env[69648]: DEBUG nova.compute.manager [None req-a13e343a-bdb6-47df-97e0-430a30cb050e tempest-ServerTagsTestJSON-1019271742 tempest-ServerTagsTestJSON-1019271742-project-member] [instance: 09d8d722-d63e-4675-bd53-7862c677424d] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1254.058310] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a13e343a-bdb6-47df-97e0-430a30cb050e tempest-ServerTagsTestJSON-1019271742 tempest-ServerTagsTestJSON-1019271742-project-member] Lock "09d8d722-d63e-4675-bd53-7862c677424d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 215.689s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.065117] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1254.065625] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1254.065625] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1254.067525] env[69648]: DEBUG nova.compute.manager [None req-e09a69ac-f613-4e7a-a503-2634acf964c7 tempest-AttachInterfacesUnderV243Test-278770139 tempest-AttachInterfacesUnderV243Test-278770139-project-member] [instance: 93b15196-95be-471b-ab26-193e23e163ba] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1254.098729] env[69648]: DEBUG nova.compute.manager [None req-e09a69ac-f613-4e7a-a503-2634acf964c7 tempest-AttachInterfacesUnderV243Test-278770139 tempest-AttachInterfacesUnderV243Test-278770139-project-member] [instance: 93b15196-95be-471b-ab26-193e23e163ba] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1254.117784] env[69648]: DEBUG oslo_concurrency.lockutils [None req-760e3608-900a-40f4-a3dd-087b7c134ced tempest-ListServerFiltersTestJSON-1244123173 tempest-ListServerFiltersTestJSON-1244123173-project-member] Lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.202s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.119209] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 80.103s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1254.119405] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 45ccc6ec-6501-4477-9b94-1c0e3d1271d9] During sync_power_state the instance has a pending task (deleting). Skip. 
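The oslo_concurrency.lockutils lines throughout this log pair every acquire with how long the caller waited (80.103s for the _sync_power_states lock above, 451.030s for the terminate lock) and every release with how long the lock was held (649.236s for the build lock). The context manager below is a rough, self-contained sketch of that timed-lock logging pattern, not the oslo.concurrency implementation.

```python
import threading
import time
from contextlib import contextmanager

_locks = {}
_registry_guard = threading.Lock()


@contextmanager
def timed_lock(name, owner):
    """Acquire a named lock and report waited/held durations, lockutils-style."""
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    start = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - start
    print(f'Lock "{name}" acquired by "{owner}" :: waited {waited:.3f}s')
    held_since = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        held = time.monotonic() - held_since
        print(f'Lock "{name}" "released" by "{owner}" :: held {held:.3f}s')


# Example: the pattern behind the "compute_resources" lines in this log.
with timed_lock("compute_resources", "ResourceTracker.instance_claim"):
    pass
```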
[ 1254.119656] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "45ccc6ec-6501-4477-9b94-1c0e3d1271d9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.128849] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e09a69ac-f613-4e7a-a503-2634acf964c7 tempest-AttachInterfacesUnderV243Test-278770139 tempest-AttachInterfacesUnderV243Test-278770139-project-member] Lock "93b15196-95be-471b-ab26-193e23e163ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.057s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.138102] env[69648]: DEBUG nova.compute.manager [None req-18c9d602-d624-4cf3-99fd-52f314023991 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] [instance: 163ce80a-d23b-43ea-8d19-a93fdce9e552] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1254.161862] env[69648]: DEBUG nova.compute.manager [None req-18c9d602-d624-4cf3-99fd-52f314023991 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] [instance: 163ce80a-d23b-43ea-8d19-a93fdce9e552] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1254.181605] env[69648]: DEBUG oslo_concurrency.lockutils [None req-18c9d602-d624-4cf3-99fd-52f314023991 tempest-AttachInterfacesTestJSON-1223796609 tempest-AttachInterfacesTestJSON-1223796609-project-member] Lock "163ce80a-d23b-43ea-8d19-a93fdce9e552" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.318s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.191710] env[69648]: DEBUG nova.compute.manager [None req-3d686f58-4968-4e49-b376-be3880882d15 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: 468e1b4a-8701-4413-a836-b8377877ccf1] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1254.215146] env[69648]: DEBUG nova.compute.manager [None req-3d686f58-4968-4e49-b376-be3880882d15 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: 468e1b4a-8701-4413-a836-b8377877ccf1] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1254.235624] env[69648]: DEBUG oslo_concurrency.lockutils [None req-3d686f58-4968-4e49-b376-be3880882d15 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "468e1b4a-8701-4413-a836-b8377877ccf1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.073s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.245700] env[69648]: DEBUG nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1254.297623] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1254.297906] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1254.299382] env[69648]: INFO nova.compute.claims [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1254.590157] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef4c83bf-eca4-4ac8-a33d-d2b8bd3fc569 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.597393] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-943151c9-b728-48b6-b24f-133722c31a1b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.627008] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79f006d5-b840-4897-9579-d69d706fe9fe {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.633934] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51ec49a9-fa3f-4b61-8462-b051b3d026f3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.646577] env[69648]: DEBUG nova.compute.provider_tree [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1254.655956] env[69648]: DEBUG nova.scheduler.client.report [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1254.672193] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.374s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.672691] env[69648]: DEBUG nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1254.704907] env[69648]: DEBUG nova.compute.utils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1254.706414] env[69648]: DEBUG nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1254.706786] env[69648]: DEBUG nova.network.neutron [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1254.716026] env[69648]: DEBUG nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1254.779993] env[69648]: DEBUG nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1254.784531] env[69648]: DEBUG nova.policy [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb4543cd552342bfa20048ac159b36c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fac4f94511c14c86b107001987d773d6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1254.805399] env[69648]: DEBUG nova.virt.hardware [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1254.805399] env[69648]: DEBUG nova.virt.hardware [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1254.805399] env[69648]: DEBUG nova.virt.hardware [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1254.805624] env[69648]: DEBUG nova.virt.hardware [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1254.805624] env[69648]: DEBUG nova.virt.hardware [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1254.805764] env[69648]: DEBUG nova.virt.hardware [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1254.805957] 
env[69648]: DEBUG nova.virt.hardware [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1254.806135] env[69648]: DEBUG nova.virt.hardware [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1254.806307] env[69648]: DEBUG nova.virt.hardware [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1254.806476] env[69648]: DEBUG nova.virt.hardware [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1254.806649] env[69648]: DEBUG nova.virt.hardware [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1254.807690] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-485d285f-0ee2-44fc-bae6-71e5576f1ab2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.815290] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9856074d-6547-4041-b249-16c866c066a5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.064993] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1255.065276] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1255.065440] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1255.150274] env[69648]: DEBUG nova.network.neutron [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Successfully created port: b2b172d3-6246-4317-93a6-cf5a0c18ec1f {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1255.856721] env[69648]: DEBUG nova.network.neutron [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Successfully updated port: b2b172d3-6246-4317-93a6-cf5a0c18ec1f {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1255.862376] env[69648]: DEBUG nova.compute.manager [req-89088704-a5dc-42ce-91b0-5772b1cbe864 req-90145a81-c22d-426c-a0fb-e14a1e8bb42a service nova] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Received event network-vif-plugged-b2b172d3-6246-4317-93a6-cf5a0c18ec1f {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1255.862376] env[69648]: DEBUG oslo_concurrency.lockutils [req-89088704-a5dc-42ce-91b0-5772b1cbe864 req-90145a81-c22d-426c-a0fb-e14a1e8bb42a service nova] Acquiring lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1255.862376] env[69648]: DEBUG oslo_concurrency.lockutils [req-89088704-a5dc-42ce-91b0-5772b1cbe864 req-90145a81-c22d-426c-a0fb-e14a1e8bb42a service nova] Lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1255.862376] env[69648]: DEBUG oslo_concurrency.lockutils [req-89088704-a5dc-42ce-91b0-5772b1cbe864 req-90145a81-c22d-426c-a0fb-e14a1e8bb42a service nova] Lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1255.862546] env[69648]: DEBUG nova.compute.manager [req-89088704-a5dc-42ce-91b0-5772b1cbe864 req-90145a81-c22d-426c-a0fb-e14a1e8bb42a service nova] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] No waiting events found dispatching network-vif-plugged-b2b172d3-6246-4317-93a6-cf5a0c18ec1f {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1255.862546] env[69648]: WARNING nova.compute.manager [req-89088704-a5dc-42ce-91b0-5772b1cbe864 req-90145a81-c22d-426c-a0fb-e14a1e8bb42a service nova] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Received unexpected event network-vif-plugged-b2b172d3-6246-4317-93a6-cf5a0c18ec1f for instance with vm_state building and task_state spawning. 
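The lockutils and event records above trace how an externally delivered Neutron notification (network-vif-plugged-b2b172d3-6246-4317-93a6-cf5a0c18ec1f) is handed to the compute manager: the per-instance "-events" lock is acquired, the pending event is popped, and because no waiter had been registered while the instance was still building, the event is reported as unexpected. The snippet below is a minimal, hypothetical sketch of that waiter/dispatch coordination pattern in plain Python (standard library only); the names EventRegistry, prepare_wait and dispatch are illustrative assumptions and are not Nova's actual API.

    import threading
    from collections import defaultdict

    class EventRegistry:
        """Sketch of the pattern visible in the log: a lock guards a table of
        pending waiters; the external-event handler pops a waiter if one was
        registered, otherwise the event is reported as unexpected."""

        def __init__(self):
            self._lock = threading.Lock()       # stands in for the "<uuid>-events" lock
            self._waiters = defaultdict(dict)   # instance_uuid -> {event_name: threading.Event}

        def prepare_wait(self, instance_uuid, event_name):
            # Called by the spawning thread before it plugs the VIF.
            ev = threading.Event()
            with self._lock:
                self._waiters[instance_uuid][event_name] = ev
            return ev

        def dispatch(self, instance_uuid, event_name):
            # Called when the external event (e.g. network-vif-plugged) arrives.
            with self._lock:
                ev = self._waiters.get(instance_uuid, {}).pop(event_name, None)
            if ev is None:
                # Corresponds to "No waiting events found dispatching ..." and the
                # "Received unexpected event ..." warning in the records above.
                print(f"unexpected event {event_name} for {instance_uuid}")
                return
            ev.set()

    registry = EventRegistry()
    waiter = registry.prepare_wait("147e1f39-c2ae-410e-9b62-cd56b5978e1b",
                                   "network-vif-plugged")
    registry.dispatch("147e1f39-c2ae-410e-9b62-cd56b5978e1b", "network-vif-plugged")
    assert waiter.wait(timeout=1.0)

In the log above the dispatch arrives before spawn has registered a waiter, which is the branch that produces the "Received unexpected event ... vm_state building and task_state spawning" warning rather than waking a waiting thread.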
[ 1255.890793] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "refresh_cache-147e1f39-c2ae-410e-9b62-cd56b5978e1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1255.890958] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquired lock "refresh_cache-147e1f39-c2ae-410e-9b62-cd56b5978e1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1255.891247] env[69648]: DEBUG nova.network.neutron [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1255.932017] env[69648]: DEBUG nova.network.neutron [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1256.065154] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1256.065287] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1256.065401] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1256.089591] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1256.089835] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1256.089991] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1256.090139] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1256.090272] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1256.090390] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1256.090525] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1256.090702] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1256.090860] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1256.090960] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1256.091144] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1256.129369] env[69648]: DEBUG nova.network.neutron [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Updating instance_info_cache with network_info: [{"id": "b2b172d3-6246-4317-93a6-cf5a0c18ec1f", "address": "fa:16:3e:61:a6:f5", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.83", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2b172d3-62", "ovs_interfaceid": "b2b172d3-6246-4317-93a6-cf5a0c18ec1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1256.143181] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Releasing lock "refresh_cache-147e1f39-c2ae-410e-9b62-cd56b5978e1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1256.143483] env[69648]: DEBUG nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Instance network_info: |[{"id": "b2b172d3-6246-4317-93a6-cf5a0c18ec1f", "address": "fa:16:3e:61:a6:f5", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.83", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2b172d3-62", "ovs_interfaceid": "b2b172d3-6246-4317-93a6-cf5a0c18ec1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1256.143865] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None 
req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:61:a6:f5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92233552-2c0c-416e-9bf3-bfcca8eda2dc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b2b172d3-6246-4317-93a6-cf5a0c18ec1f', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1256.151230] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Creating folder: Project (fac4f94511c14c86b107001987d773d6). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1256.151707] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-39007d29-9193-4076-885a-37b4014c6359 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.162674] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Created folder: Project (fac4f94511c14c86b107001987d773d6) in parent group-v692308. [ 1256.162857] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Creating folder: Instances. Parent ref: group-v692376. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1256.163116] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-199abae9-9021-4bb4-8069-8225e5607d87 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.171820] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Created folder: Instances in parent group-v692376. [ 1256.172058] env[69648]: DEBUG oslo.service.loopingcall [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1256.172239] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1256.172426] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e99e268b-aec4-4494-9078-bb071b7a302c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.192029] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1256.192029] env[69648]: value = "task-3466571" [ 1256.192029] env[69648]: _type = "Task" [ 1256.192029] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1256.199015] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466571, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1256.701188] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466571, 'name': CreateVM_Task, 'duration_secs': 0.301084} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1256.701379] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1256.702064] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1256.702235] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1256.702537] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1256.702788] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-14b315fa-0553-4db5-9a4f-3d55150f5970 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.707286] env[69648]: DEBUG oslo_vmware.api [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for the task: (returnval){ [ 1256.707286] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5260c769-40d5-db30-9387-4022effe3e04" [ 1256.707286] env[69648]: _type = "Task" [ 1256.707286] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1256.714625] env[69648]: DEBUG oslo_vmware.api [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5260c769-40d5-db30-9387-4022effe3e04, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1257.065219] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1257.078411] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1257.078597] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1257.078764] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.078924] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1257.080157] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bf3007f-eb37-4baa-a24f-ccedf789c812 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1257.089047] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88f8b61a-5aff-4e15-a111-8dd9bd7b2c7c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1257.102655] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c21669a9-705f-4157-a6db-d20fb1436144 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1257.109045] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e067bba0-1b1d-45ab-b147-3de870dc799f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1257.139956] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180980MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1257.140129] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1257.140324] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1257.208784] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1257.208950] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1257.209093] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1257.209245] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60b00251-25fc-483d-88fe-a84165d6a435 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1257.209371] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6062dd02-230d-42bc-8304-fc122f1f1489 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1257.209489] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e64fd474-91ab-449e-8785-e788685ed77a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1257.209608] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1257.209724] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d5fb115d-778d-4fc7-a03a-8f5828868a01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1257.209837] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ab839f84-b864-409e-883d-00dddb5db3db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1257.209947] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1257.220739] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1257.220988] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1257.221213] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1257.221763] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a924bdee-1e16-4d78-ac6b-9574677de55f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1257.231721] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1257.241120] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 58804be5-ee46-4b25-be84-890d5cd1607f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1257.251119] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 74a74c62-5c24-426e-ae6f-29511de99462 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1257.260014] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60cb1a27-ecc3-43a6-8efa-b54fd2f400ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1257.270144] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c97308be-406b-4fd0-b502-69e8c800773f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1257.281094] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8426cc36-f026-46d9-844e-432343410efe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1257.290485] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 27bd5158-0b5f-408d-b91c-e7f3fbde894e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1257.300520] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0af9540b-b092-4396-8573-cdadc66abe02 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1257.309871] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0e63df98-0345-4d75-b128-291048849e40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1257.319547] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance b7cd17f7-89f5-4f85-8964-532467432b59 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1257.319790] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1257.319949] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1257.550680] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17588885-cbbc-4476-b69a-6fb271b0d49b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1257.558298] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39a9edc2-5e56-4970-9e67-37efb5218010 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1257.588983] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9991c55c-b2ee-43e0-98b5-b534e21d8ffa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1257.596040] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49a806dd-e4b0-4c23-ab27-99688371a609 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1257.613066] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1257.628032] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1257.647639] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1257.647921] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.507s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.903810] env[69648]: DEBUG nova.compute.manager [req-96e274de-cb9e-4685-ac96-aef91173b7bb req-328a89f5-f528-4451-bb63-285555f67a7a service nova] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Received event network-changed-b2b172d3-6246-4317-93a6-cf5a0c18ec1f {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1257.904037] env[69648]: DEBUG nova.compute.manager [req-96e274de-cb9e-4685-ac96-aef91173b7bb req-328a89f5-f528-4451-bb63-285555f67a7a service nova] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Refreshing instance network info cache due to event network-changed-b2b172d3-6246-4317-93a6-cf5a0c18ec1f. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1257.904264] env[69648]: DEBUG oslo_concurrency.lockutils [req-96e274de-cb9e-4685-ac96-aef91173b7bb req-328a89f5-f528-4451-bb63-285555f67a7a service nova] Acquiring lock "refresh_cache-147e1f39-c2ae-410e-9b62-cd56b5978e1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1257.904429] env[69648]: DEBUG oslo_concurrency.lockutils [req-96e274de-cb9e-4685-ac96-aef91173b7bb req-328a89f5-f528-4451-bb63-285555f67a7a service nova] Acquired lock "refresh_cache-147e1f39-c2ae-410e-9b62-cd56b5978e1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1257.904598] env[69648]: DEBUG nova.network.neutron [req-96e274de-cb9e-4685-ac96-aef91173b7bb req-328a89f5-f528-4451-bb63-285555f67a7a service nova] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Refreshing network info cache for port b2b172d3-6246-4317-93a6-cf5a0c18ec1f {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1258.198636] env[69648]: DEBUG nova.network.neutron [req-96e274de-cb9e-4685-ac96-aef91173b7bb req-328a89f5-f528-4451-bb63-285555f67a7a service nova] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Updated VIF entry in instance network info cache for port b2b172d3-6246-4317-93a6-cf5a0c18ec1f. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1258.199070] env[69648]: DEBUG nova.network.neutron [req-96e274de-cb9e-4685-ac96-aef91173b7bb req-328a89f5-f528-4451-bb63-285555f67a7a service nova] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Updating instance_info_cache with network_info: [{"id": "b2b172d3-6246-4317-93a6-cf5a0c18ec1f", "address": "fa:16:3e:61:a6:f5", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.83", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2b172d3-62", "ovs_interfaceid": "b2b172d3-6246-4317-93a6-cf5a0c18ec1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1258.208626] env[69648]: DEBUG oslo_concurrency.lockutils [req-96e274de-cb9e-4685-ac96-aef91173b7bb req-328a89f5-f528-4451-bb63-285555f67a7a service nova] Releasing lock "refresh_cache-147e1f39-c2ae-410e-9b62-cd56b5978e1b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1300.731814] env[69648]: WARNING oslo_vmware.rw_handles [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1300.731814] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1300.731814] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1300.731814] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1300.731814] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1300.731814] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1300.731814] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1300.731814] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1300.731814] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1300.731814] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1300.731814] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1300.731814] env[69648]: ERROR oslo_vmware.rw_handles [ 1300.732550] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 
tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/53ce4c80-335b-4ea1-8097-36f2d5d6efa6/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1300.734448] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1300.734733] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Copying Virtual Disk [datastore1] vmware_temp/53ce4c80-335b-4ea1-8097-36f2d5d6efa6/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/53ce4c80-335b-4ea1-8097-36f2d5d6efa6/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1300.735029] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f972867e-6a08-48a5-9382-c768f5764ed6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.744898] env[69648]: DEBUG oslo_vmware.api [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Waiting for the task: (returnval){ [ 1300.744898] env[69648]: value = "task-3466572" [ 1300.744898] env[69648]: _type = "Task" [ 1300.744898] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1300.753818] env[69648]: DEBUG oslo_vmware.api [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Task: {'id': task-3466572, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1301.256030] env[69648]: DEBUG oslo_vmware.exceptions [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1301.256420] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1301.256970] env[69648]: ERROR nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1301.256970] env[69648]: Faults: ['InvalidArgument'] [ 1301.256970] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Traceback (most recent call last): [ 1301.256970] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1301.256970] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] yield resources [ 1301.256970] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1301.256970] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] self.driver.spawn(context, instance, image_meta, [ 1301.256970] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1301.256970] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1301.256970] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1301.256970] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] self._fetch_image_if_missing(context, vi) [ 1301.256970] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] image_cache(vi, tmp_image_ds_loc) [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] vm_util.copy_virtual_disk( [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] session._wait_for_task(vmdk_copy_task) [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] return self.wait_for_task(task_ref) [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] return evt.wait() [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] result = hub.switch() [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1301.257457] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] return self.greenlet.switch() [ 1301.257928] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1301.257928] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] self.f(*self.args, **self.kw) [ 1301.257928] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1301.257928] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] raise exceptions.translate_fault(task_info.error) [ 1301.257928] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1301.257928] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Faults: ['InvalidArgument'] [ 1301.257928] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] [ 1301.257928] env[69648]: INFO nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Terminating instance [ 1301.258877] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1301.259131] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1301.259374] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1c94336f-4ee7-4dab-b411-d7a05441060f 
{{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.264009] env[69648]: DEBUG nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1301.264216] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1301.264962] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c306659d-547d-49ab-943e-246f1fb8b6a2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.268706] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1301.268880] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1301.269895] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4d5964b7-6070-48cf-8ee6-be84b32d26ed {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.273807] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1301.274314] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8a6ebf71-a7f0-463e-a29d-a0154f9e39f7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.276832] env[69648]: DEBUG oslo_vmware.api [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Waiting for the task: (returnval){ [ 1301.276832] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528ddd96-12b6-6871-ab4b-3acda4a1f976" [ 1301.276832] env[69648]: _type = "Task" [ 1301.276832] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1301.292931] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1301.293174] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Creating directory with path [datastore1] vmware_temp/7f57ce04-3b0d-4352-9cb3-a135c27ae046/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1301.293397] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9f120aa0-cdcf-428e-a03a-d375f36e71d4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.315211] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Created directory with path [datastore1] vmware_temp/7f57ce04-3b0d-4352-9cb3-a135c27ae046/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1301.315417] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Fetch image to [datastore1] vmware_temp/7f57ce04-3b0d-4352-9cb3-a135c27ae046/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1301.315590] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/7f57ce04-3b0d-4352-9cb3-a135c27ae046/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1301.316409] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f8015b6-e9f0-4c09-91ba-af41ab10093f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.323406] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee8ec5e0-5767-49b4-9947-f9febb95be6a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.332909] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d945d31-9194-438f-b9ef-e57a4cfa3593 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.365321] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-7a6e9273-3247-4f06-b0ea-1c789ab385fc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.368141] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1301.368346] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1301.368523] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Deleting the datastore file [datastore1] 63b167e7-3d86-4ee4-8bae-bfb8fe084135 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1301.368823] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3a111323-c3e0-4a0f-b521-cd9b8b93d39b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.375224] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f78994b2-c660-4d44-83b0-62e2c5fb6285 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.376936] env[69648]: DEBUG oslo_vmware.api [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Waiting for the task: (returnval){ [ 1301.376936] env[69648]: value = "task-3466574" [ 1301.376936] env[69648]: _type = "Task" [ 1301.376936] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1301.395910] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1301.550077] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1301.551694] env[69648]: ERROR nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. 
[ 1301.551694] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Traceback (most recent call last): [ 1301.551694] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1301.551694] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1301.551694] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1301.551694] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] result = getattr(controller, method)(*args, **kwargs) [ 1301.551694] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1301.551694] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._get(image_id) [ 1301.551694] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1301.551694] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1301.551694] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] resp, body = self.http_client.get(url, headers=header) [ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.request(url, 'GET', **kwargs) [ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._handle_response(resp) [ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise exc.from_response(resp, resp.content) [ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] During handling of the above exception, another exception occurred: [ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1301.552149] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Traceback (most recent call last): [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] yield resources [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self.driver.spawn(context, instance, image_meta, [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._fetch_image_if_missing(context, vi) [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] image_fetch(context, vi, tmp_image_ds_loc) [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] images.fetch_image( [ 1301.552641] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] metadata = IMAGE_API.get(context, image_ref) [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return session.show(context, image_id, [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] _reraise_translated_image_exception(image_id) [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise new_exc.with_traceback(exc_trace) [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] result = getattr(controller, method)(*args, **kwargs) [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1301.553126] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._get(image_id) [ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] resp, body = self.http_client.get(url, headers=header) [ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.request(url, 'GET', **kwargs) [ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._handle_response(resp) [ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise exc.from_response(resp, resp.content) [ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. 
[ 1301.553558] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1301.554015] env[69648]: INFO nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Terminating instance [ 1301.554015] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1301.554015] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1301.554157] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-12c8bf59-ea51-4783-8efa-22e34de1b0fb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.559171] env[69648]: DEBUG nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1301.559171] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1301.559456] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2f82003-310c-4ba8-897f-986eea1e37bd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.566517] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1301.567111] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bb281192-7dda-48bf-8fc1-84a4c9e912a3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.569049] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1301.569312] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None 
req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1301.570333] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-141d0f72-ec67-42c0-8674-91f00a3726a4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.575393] env[69648]: DEBUG oslo_vmware.api [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Waiting for the task: (returnval){ [ 1301.575393] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]522c0e0b-5356-1735-3073-5e0ac5fd7475" [ 1301.575393] env[69648]: _type = "Task" [ 1301.575393] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1301.590289] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1301.590527] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Creating directory with path [datastore1] vmware_temp/b51c3ba4-dbf6-467d-bbc4-cd7c7aa569cb/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1301.590736] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ba3cf1e5-5d68-4508-b066-678140c16865 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.616137] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Created directory with path [datastore1] vmware_temp/b51c3ba4-dbf6-467d-bbc4-cd7c7aa569cb/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1301.616350] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Fetch image to [datastore1] vmware_temp/b51c3ba4-dbf6-467d-bbc4-cd7c7aa569cb/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1301.616529] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] 
vmware_temp/b51c3ba4-dbf6-467d-bbc4-cd7c7aa569cb/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1301.617314] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97162a36-835e-4eb8-b180-9b513791265d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.624408] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8d1e47c-4955-4d67-b7c7-60962672da75 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.633888] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89f5ae35-dcee-427f-bcaa-72cc6e1c20f6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.643892] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1301.644123] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1301.644307] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Deleting the datastore file [datastore1] 62954fe5-a462-40bd-85ec-d03b98d2ec42 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1301.668662] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-999cf01f-e7ab-4601-98dd-2611fdd65aef {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.671247] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7deabe19-845a-4b89-a64f-88dea2c5252d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.679245] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-59b4219f-be66-4a3f-ae82-dce91b35a9b4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.681040] env[69648]: DEBUG oslo_vmware.api [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Waiting for the task: (returnval){ [ 1301.681040] env[69648]: value = "task-3466576" [ 1301.681040] env[69648]: _type = "Task" [ 1301.681040] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1301.688968] env[69648]: DEBUG oslo_vmware.api [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Task: {'id': task-3466576, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1301.711963] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1301.764768] env[69648]: DEBUG oslo_vmware.rw_handles [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b51c3ba4-dbf6-467d-bbc4-cd7c7aa569cb/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1301.823326] env[69648]: DEBUG oslo_vmware.rw_handles [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1301.823536] env[69648]: DEBUG oslo_vmware.rw_handles [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b51c3ba4-dbf6-467d-bbc4-cd7c7aa569cb/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1301.887837] env[69648]: DEBUG oslo_vmware.api [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Task: {'id': task-3466574, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084488} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1301.888064] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1301.888256] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1301.888429] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1301.888602] env[69648]: INFO nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1301.890787] env[69648]: DEBUG nova.compute.claims [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1301.890963] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1301.891200] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1302.186857] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82d52bd5-7e1f-4a7c-b02d-9dba6e5579dc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1302.198442] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d505da00-5101-4fc0-add2-76cc451f9c52 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1302.201490] env[69648]: DEBUG oslo_vmware.api [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Task: {'id': task-3466576, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075835} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1302.201750] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1302.201915] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1302.202102] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1302.202274] env[69648]: INFO nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Took 0.64 seconds to destroy the instance on the hypervisor. [ 1302.230291] env[69648]: DEBUG nova.compute.claims [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1302.230488] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1302.231209] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4b86622-1642-4560-bfbf-042eb280eab3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1302.238790] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fefc21d0-f7b9-4054-bc56-2ccc814bc96f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1302.252138] env[69648]: DEBUG nova.compute.provider_tree [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1302.262029] env[69648]: DEBUG nova.scheduler.client.report [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 
1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1302.275980] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.385s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1302.276537] env[69648]: ERROR nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1302.276537] env[69648]: Faults: ['InvalidArgument'] [ 1302.276537] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Traceback (most recent call last): [ 1302.276537] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1302.276537] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] self.driver.spawn(context, instance, image_meta, [ 1302.276537] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1302.276537] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1302.276537] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1302.276537] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] self._fetch_image_if_missing(context, vi) [ 1302.276537] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1302.276537] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] image_cache(vi, tmp_image_ds_loc) [ 1302.276537] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] vm_util.copy_virtual_disk( [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] session._wait_for_task(vmdk_copy_task) [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1302.276948] 
env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] return self.wait_for_task(task_ref) [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] return evt.wait() [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] result = hub.switch() [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] return self.greenlet.switch() [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1302.276948] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] self.f(*self.args, **self.kw) [ 1302.277406] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1302.277406] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] raise exceptions.translate_fault(task_info.error) [ 1302.277406] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1302.277406] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Faults: ['InvalidArgument'] [ 1302.277406] env[69648]: ERROR nova.compute.manager [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] [ 1302.277406] env[69648]: DEBUG nova.compute.utils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1302.278756] env[69648]: DEBUG nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Build of instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 was re-scheduled: A specified parameter was not correct: fileType [ 1302.278756] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1302.279184] env[69648]: DEBUG nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1302.279367] env[69648]: DEBUG nova.compute.manager [None 
req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1302.279549] env[69648]: DEBUG nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1302.279716] env[69648]: DEBUG nova.network.neutron [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1302.281388] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.051s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1302.555353] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78179e96-e3ca-4d31-a99d-17a2d49b6c37 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1302.563801] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20862aed-299b-4435-bf30-75ace3142924 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1302.594664] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-449ab488-fd1d-417c-942e-8b0f3f1156af {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1302.602119] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-332ff19b-2aa1-4f21-9f35-9c7ed08f6973 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1302.615248] env[69648]: DEBUG nova.compute.provider_tree [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1302.627860] env[69648]: DEBUG nova.scheduler.client.report [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1302.644664] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.363s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1302.645409] env[69648]: ERROR nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. [ 1302.645409] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Traceback (most recent call last): [ 1302.645409] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1302.645409] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1302.645409] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1302.645409] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] result = getattr(controller, method)(*args, **kwargs) [ 1302.645409] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1302.645409] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._get(image_id) [ 1302.645409] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1302.645409] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1302.645409] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] resp, body = self.http_client.get(url, headers=header) [ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.request(url, 'GET', **kwargs) [ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._handle_response(resp) 
[ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise exc.from_response(resp, resp.content) [ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] During handling of the above exception, another exception occurred: [ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.645789] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Traceback (most recent call last): [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self.driver.spawn(context, instance, image_meta, [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._fetch_image_if_missing(context, vi) [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] image_fetch(context, vi, tmp_image_ds_loc) [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] images.fetch_image( [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] metadata = IMAGE_API.get(context, image_ref) [ 1302.646197] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return 
session.show(context, image_id, [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] _reraise_translated_image_exception(image_id) [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise new_exc.with_traceback(exc_trace) [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] result = getattr(controller, method)(*args, **kwargs) [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._get(image_id) [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1302.649036] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1302.649498] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1302.649498] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] resp, body = self.http_client.get(url, headers=header) [ 1302.649498] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1302.649498] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.request(url, 'GET', **kwargs) [ 1302.649498] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1302.649498] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._handle_response(resp) [ 1302.649498] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1302.649498] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise exc.from_response(resp, resp.content) [ 1302.649498] env[69648]: ERROR nova.compute.manager [instance: 
62954fe5-a462-40bd-85ec-d03b98d2ec42] nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. [ 1302.649498] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.649791] env[69648]: DEBUG nova.compute.utils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1302.649791] env[69648]: DEBUG nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Build of instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 was re-scheduled: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1302.649791] env[69648]: DEBUG nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1302.649791] env[69648]: DEBUG nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1302.649965] env[69648]: DEBUG nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1302.649965] env[69648]: DEBUG nova.network.neutron [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1302.656637] env[69648]: DEBUG nova.network.neutron [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1302.665485] env[69648]: INFO nova.compute.manager [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Took 0.39 seconds to deallocate network for instance. 
[ 1302.778021] env[69648]: INFO nova.scheduler.client.report [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Deleted allocations for instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 [ 1302.788187] env[69648]: DEBUG neutronclient.v2_0.client [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=69648) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1302.790527] env[69648]: ERROR nova.compute.manager [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1302.790527] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Traceback (most recent call last): [ 1302.790527] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1302.790527] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1302.790527] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1302.790527] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] result = getattr(controller, method)(*args, **kwargs) [ 1302.790527] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1302.790527] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._get(image_id) [ 1302.790527] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1302.790527] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1302.790527] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] resp, body = self.http_client.get(url, headers=header) [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.request(url, 'GET', **kwargs) [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 
62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._handle_response(resp) [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise exc.from_response(resp, resp.content) [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] During handling of the above exception, another exception occurred: [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.790925] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Traceback (most recent call last): [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self.driver.spawn(context, instance, image_meta, [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._fetch_image_if_missing(context, vi) [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] image_fetch(context, vi, tmp_image_ds_loc) [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] images.fetch_image( [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] metadata = IMAGE_API.get(context, image_ref) [ 1302.791325] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1302.791763] env[69648]: ERROR nova.compute.manager 
[instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return session.show(context, image_id, [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] _reraise_translated_image_exception(image_id) [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise new_exc.with_traceback(exc_trace) [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] result = getattr(controller, method)(*args, **kwargs) [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._get(image_id) [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1302.791763] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] resp, body = self.http_client.get(url, headers=header) [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.request(url, 'GET', **kwargs) [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self._handle_response(resp) [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise exc.from_response(resp, resp.content) [ 1302.792141] env[69648]: ERROR 
nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] During handling of the above exception, another exception occurred: [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.792141] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Traceback (most recent call last): [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 2431, in _do_build_and_run_instance [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._build_and_run_instance(context, instance, image, [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 2723, in _build_and_run_instance [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise exception.RescheduledException( [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] nova.exception.RescheduledException: Build of instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 was re-scheduled: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] During handling of the above exception, another exception occurred: [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Traceback (most recent call last): [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1302.792569] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] exception_handler_v20(status_code, error_body) [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise client_exc(message=error_message, [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 
62954fe5-a462-40bd-85ec-d03b98d2ec42] Neutron server returns request_ids: ['req-dde062b5-4e08-4be6-b4d0-d80541d8969a'] [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] During handling of the above exception, another exception occurred: [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Traceback (most recent call last): [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 3020, in _cleanup_allocated_networks [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._deallocate_network(context, instance, requested_networks) [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self.network_api.deallocate_for_instance( [ 1302.792965] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] data = neutron.list_ports(**search_opts) [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.list('ports', self.ports_path, retrieve_all, [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] for r in self._pagination(collection, path, **params): [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] res = self.get(path, params=params) [ 1302.793488] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1302.793488] env[69648]: 
ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.retry_request("GET", action, body=body, [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.do_request(method, action, body=body, [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._handle_fault_response(status_code, replybody, resp) [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise exception.Unauthorized() [ 1302.793914] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] nova.exception.Unauthorized: Not authorized. 
[ 1302.794429] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1302.804021] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b95485e6-76bd-4102-9ba0-e66a793fecab tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 679.012s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1302.805433] env[69648]: DEBUG oslo_concurrency.lockutils [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 481.946s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1302.806268] env[69648]: DEBUG oslo_concurrency.lockutils [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquiring lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1302.806484] env[69648]: DEBUG oslo_concurrency.lockutils [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1302.806747] env[69648]: DEBUG oslo_concurrency.lockutils [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1302.808702] env[69648]: INFO nova.compute.manager [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Terminating instance [ 1302.810461] env[69648]: DEBUG oslo_concurrency.lockutils [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquiring lock "refresh_cache-63b167e7-3d86-4ee4-8bae-bfb8fe084135" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1302.810461] env[69648]: DEBUG oslo_concurrency.lockutils [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Acquired lock "refresh_cache-63b167e7-3d86-4ee4-8bae-bfb8fe084135" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1302.810461] env[69648]: DEBUG nova.network.neutron [None 
req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1302.815227] env[69648]: DEBUG nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1302.854561] env[69648]: DEBUG nova.network.neutron [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1302.865080] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1302.865422] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1302.867034] env[69648]: INFO nova.compute.claims [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1302.870099] env[69648]: INFO nova.scheduler.client.report [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Deleted allocations for instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 [ 1302.897160] env[69648]: DEBUG oslo_concurrency.lockutils [None req-22ce4079-b2e7-43c9-a207-4ed041a6fe44 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Lock "62954fe5-a462-40bd-85ec-d03b98d2ec42" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 628.134s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1302.898300] env[69648]: DEBUG oslo_concurrency.lockutils [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Lock "62954fe5-a462-40bd-85ec-d03b98d2ec42" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 431.391s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1302.898574] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquiring lock "62954fe5-a462-40bd-85ec-d03b98d2ec42-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1302.898736] env[69648]: DEBUG oslo_concurrency.lockutils [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Lock "62954fe5-a462-40bd-85ec-d03b98d2ec42-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1302.898905] env[69648]: DEBUG oslo_concurrency.lockutils [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Lock "62954fe5-a462-40bd-85ec-d03b98d2ec42-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1302.901426] env[69648]: INFO nova.compute.manager [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Terminating instance [ 1302.902765] env[69648]: DEBUG oslo_concurrency.lockutils [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquiring lock "refresh_cache-62954fe5-a462-40bd-85ec-d03b98d2ec42" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1302.903029] env[69648]: DEBUG oslo_concurrency.lockutils [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Acquired lock "refresh_cache-62954fe5-a462-40bd-85ec-d03b98d2ec42" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1302.903127] env[69648]: DEBUG nova.network.neutron [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1302.914343] env[69648]: DEBUG nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1302.972142] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1303.016243] env[69648]: DEBUG nova.network.neutron [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1303.029081] env[69648]: DEBUG oslo_concurrency.lockutils [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Releasing lock "refresh_cache-63b167e7-3d86-4ee4-8bae-bfb8fe084135" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1303.029666] env[69648]: DEBUG nova.compute.manager [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1303.029737] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1303.030437] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d1d82945-9c4a-46df-83fe-08f3984e57df {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.040939] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8299c91c-702d-40e6-9349-eae2446e3e18 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.072856] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 63b167e7-3d86-4ee4-8bae-bfb8fe084135 could not be found. 
[ 1303.073086] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1303.073269] env[69648]: INFO nova.compute.manager [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1303.073516] env[69648]: DEBUG oslo.service.loopingcall [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1303.075889] env[69648]: DEBUG nova.compute.manager [-] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1303.075990] env[69648]: DEBUG nova.network.neutron [-] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1303.094697] env[69648]: DEBUG nova.network.neutron [-] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1303.107955] env[69648]: DEBUG nova.network.neutron [-] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1303.119016] env[69648]: INFO nova.compute.manager [-] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] Took 0.04 seconds to deallocate network for instance. [ 1303.217555] env[69648]: DEBUG oslo_concurrency.lockutils [None req-56b7424c-a570-499f-9723-ac72d4266544 tempest-ServerActionsTestJSON-1586963996 tempest-ServerActionsTestJSON-1586963996-project-member] Lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.412s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1303.218411] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 129.202s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1303.218591] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 63b167e7-3d86-4ee4-8bae-bfb8fe084135] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1303.218765] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "63b167e7-3d86-4ee4-8bae-bfb8fe084135" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1303.225631] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c864ec2b-cd4d-457d-82ef-175e719dabee {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.239138] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-896a1dd6-21b0-4ffc-b34d-5b44cb19a56b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.270213] env[69648]: DEBUG nova.network.neutron [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Updating instance_info_cache with network_info: [{"id": "a2948875-9ca8-4e4e-8724-545119122a58", "address": "fa:16:3e:06:88:66", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.211", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2948875-9c", "ovs_interfaceid": "a2948875-9ca8-4e4e-8724-545119122a58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1303.271723] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21fe7a1a-1acb-42b9-bf08-b1abb7892153 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.280238] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaa0eff2-d426-4e83-ae3a-a86d4131b6c6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.284292] env[69648]: DEBUG oslo_concurrency.lockutils [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Releasing lock "refresh_cache-62954fe5-a462-40bd-85ec-d03b98d2ec42" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1303.284721] env[69648]: DEBUG nova.compute.manager [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 
tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1303.284886] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1303.285354] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2e0e8e22-078b-4817-bde0-92db7df370b7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.296828] env[69648]: DEBUG nova.compute.provider_tree [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1303.301212] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c00f9f1-75aa-4564-b4db-9f9d9c077989 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.312425] env[69648]: DEBUG nova.scheduler.client.report [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1303.333459] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 62954fe5-a462-40bd-85ec-d03b98d2ec42 could not be found. [ 1303.333653] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1303.333858] env[69648]: INFO nova.compute.manager [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1303.334121] env[69648]: DEBUG oslo.service.loopingcall [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1303.334722] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.469s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1303.335222] env[69648]: DEBUG nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1303.337347] env[69648]: DEBUG nova.compute.manager [-] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1303.337451] env[69648]: DEBUG nova.network.neutron [-] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1303.339441] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.368s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1303.340772] env[69648]: INFO nova.compute.claims [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1303.370698] env[69648]: DEBUG nova.compute.utils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1303.373084] env[69648]: DEBUG nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Allocating IP information in the background. 
{{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1303.373283] env[69648]: DEBUG nova.network.neutron [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1303.381913] env[69648]: DEBUG nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1303.450014] env[69648]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=69648) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1303.450014] env[69648]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1303.450014] env[69648]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1303.450014] env[69648]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1303.450014] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.450014] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1303.450014] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1303.450014] env[69648]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1303.450014] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1303.450014] env[69648]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1303.450014] env[69648]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-592a5c4e-a22d-4a76-a651-55862491cb10'] [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1303.450906] env[69648]: ERROR 
oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1303.450906] env[69648]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1303.480928] env[69648]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1303.482084] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1303.482084] env[69648]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1303.482084] env[69648]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1303.482084] env[69648]: ERROR oslo.service.loopingcall [ 1303.482084] env[69648]: ERROR nova.compute.manager [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1303.482084] env[69648]: DEBUG nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1303.499771] env[69648]: ERROR nova.compute.manager [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1303.499771] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Traceback (most recent call last): [ 1303.499771] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.499771] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1303.499771] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1303.499771] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] exception_handler_v20(status_code, error_body) [ 1303.499771] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1303.499771] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise client_exc(message=error_message, [ 1303.499771] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1303.499771] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Neutron server returns request_ids: ['req-592a5c4e-a22d-4a76-a651-55862491cb10'] [ 1303.499771] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] During handling of the above exception, another exception occurred: [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Traceback (most recent call last): [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._delete_instance(context, instance, bdms) [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._shutdown_instance(context, instance, bdms) [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._try_deallocate_network(context, instance, requested_networks) [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] with excutils.save_and_reraise_exception(): [ 1303.499980] env[69648]: ERROR 
nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1303.499980] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self.force_reraise() [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise self.value [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] _deallocate_network_with_retries() [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return evt.wait() [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] result = hub.switch() [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.greenlet.switch() [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1303.500260] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] result = func(*self.args, **self.kw) [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] result = f(*args, **kwargs) [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._deallocate_network( [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self.network_api.deallocate_for_instance( [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 
62954fe5-a462-40bd-85ec-d03b98d2ec42] data = neutron.list_ports(**search_opts) [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.list('ports', self.ports_path, retrieve_all, [ 1303.500495] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] for r in self._pagination(collection, path, **params): [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] res = self.get(path, params=params) [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.retry_request("GET", action, body=body, [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1303.500716] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] return self.do_request(method, action, body=body, [ 1303.500962] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.500962] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] ret = obj(*args, **kwargs) [ 1303.500962] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1303.500962] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] self._handle_fault_response(status_code, replybody, resp) [ 1303.500962] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1303.500962] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1303.500962] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1303.500962] env[69648]: ERROR nova.compute.manager [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] [ 1303.507740] env[69648]: DEBUG nova.virt.hardware [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1303.508123] env[69648]: DEBUG nova.virt.hardware [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1303.508363] env[69648]: DEBUG nova.virt.hardware [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1303.508603] env[69648]: DEBUG nova.virt.hardware [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1303.508784] env[69648]: DEBUG nova.virt.hardware [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1303.508966] env[69648]: DEBUG nova.virt.hardware [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 
{{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1303.509206] env[69648]: DEBUG nova.virt.hardware [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1303.509456] env[69648]: DEBUG nova.virt.hardware [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1303.509656] env[69648]: DEBUG nova.virt.hardware [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1303.509917] env[69648]: DEBUG nova.virt.hardware [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1303.510162] env[69648]: DEBUG nova.virt.hardware [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1303.511595] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0f73627-55dc-4086-874e-4ce9f26b62fc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.524207] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31e30267-1478-4c66-b50b-2ef2ab56d626 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.530330] env[69648]: DEBUG nova.policy [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb4543cd552342bfa20048ac159b36c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fac4f94511c14c86b107001987d773d6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1303.547540] env[69648]: DEBUG oslo_concurrency.lockutils [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Lock "62954fe5-a462-40bd-85ec-d03b98d2ec42" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.649s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1303.549028] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "62954fe5-a462-40bd-85ec-d03b98d2ec42" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 129.532s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1303.549230] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] During sync_power_state the instance has a pending task (deleting). Skip. [ 1303.549406] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "62954fe5-a462-40bd-85ec-d03b98d2ec42" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1303.595863] env[69648]: INFO nova.compute.manager [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] [instance: 62954fe5-a462-40bd-85ec-d03b98d2ec42] Successfully reverted task state from None on failure for instance. [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server [None req-bc243a44-4f70-4b7a-881c-ed3361d39106 tempest-ServerExternalEventsTest-2006061798 tempest-ServerExternalEventsTest-2006061798-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
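The lock lines above ("acquired by ... waited 129.532s", "released by ... held 0.000s") come from oslo.concurrency: the power-state sync task queued behind the terminate path's lock on the same instance UUID and, once it finally acquired it, saw a pending delete task and skipped. A rough sketch of that per-instance locking pattern, assuming the plain lockutils.synchronized decorator rather than Nova's own helper wrappers:

```python
# Rough sketch of the locking pattern behind the "acquired by"/"released by"
# DEBUG lines above; illustrative, not Nova's exact _sync_power_states code.
from oslo_concurrency import lockutils


def query_driver_power_state_and_sync(instance_uuid, has_pending_task):
    # Serialize against other per-instance operations (terminate, in the log
    # above) by locking on the instance UUID. The "waited ...s" / "held ...s"
    # timings are logged by the decorator's inner wrapper in lockutils.py.
    @lockutils.synchronized(instance_uuid)
    def _sync_locked():
        if has_pending_task:
            # Mirrors the INFO line above: skip the sync while a task
            # (e.g. deleting) is still in flight.
            return None
        # ... otherwise compare driver power state with the DB record ...
    return _sync_locked()
```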
[ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-592a5c4e-a22d-4a76-a651-55862491cb10'] [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1303.599516] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1303.599853] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1303.600186] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1303.600607] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1303.600941] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.601282] env[69648]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1303.601282] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1303.601587] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1303.601587] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1303.601587] env[69648]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1303.601587] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1303.601587] env[69648]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1303.601587] env[69648]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1303.601587] env[69648]: ERROR oslo_messaging.rpc.server [ 1303.671357] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ccc1483-abcc-4da9-8627-b0a6c572c594 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.679211] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-499ee4d8-7c07-4ee1-aaf4-d7d93eea939f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.709784] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ab06283-b0b6-4951-ac66-889bea68f238 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.717373] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9285af20-911f-4e82-b51c-14e9201b0e95 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.730736] env[69648]: DEBUG nova.compute.provider_tree [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1303.739852] env[69648]: DEBUG nova.scheduler.client.report [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1303.754799] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.415s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1303.755309] env[69648]: DEBUG nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Start building networks asynchronously for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1303.788652] env[69648]: DEBUG nova.compute.utils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1303.790139] env[69648]: DEBUG nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1303.790364] env[69648]: DEBUG nova.network.neutron [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1303.806254] env[69648]: DEBUG nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1303.846599] env[69648]: DEBUG nova.policy [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd02320b12288496eae0a735447321a7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '896367398859465488fc12205d122a4e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1303.871573] env[69648]: DEBUG nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1303.900020] env[69648]: DEBUG nova.virt.hardware [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1303.900135] env[69648]: DEBUG nova.virt.hardware [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1303.900241] env[69648]: DEBUG nova.virt.hardware [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1303.900429] env[69648]: DEBUG nova.virt.hardware [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1303.900580] env[69648]: DEBUG nova.virt.hardware [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1303.901473] env[69648]: DEBUG nova.virt.hardware [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1303.901473] env[69648]: DEBUG nova.virt.hardware [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1303.901473] env[69648]: DEBUG nova.virt.hardware [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1303.901473] env[69648]: DEBUG nova.virt.hardware [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 
tempest-ImagesTestJSON-1163555157-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1303.901473] env[69648]: DEBUG nova.virt.hardware [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1303.901625] env[69648]: DEBUG nova.virt.hardware [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1303.902631] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a14d63f-e170-4d1d-a4c5-ea9ce283b60a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.912819] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eba1d74-1349-4243-b86d-626aaac28568 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.938018] env[69648]: DEBUG nova.network.neutron [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Successfully created port: 683ba951-6968-469d-996b-303a43d8206b {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1304.271291] env[69648]: DEBUG nova.network.neutron [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Successfully created port: 8ba5496b-0209-43dc-bf01-d969c4cafaf4 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1304.715872] env[69648]: DEBUG nova.compute.manager [req-bc93f57e-b522-40e2-b4a7-e4b5408caa6e req-03573a4c-4a4e-44d1-9d3e-5b239c40c04a service nova] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Received event network-vif-plugged-683ba951-6968-469d-996b-303a43d8206b {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1304.716114] env[69648]: DEBUG oslo_concurrency.lockutils [req-bc93f57e-b522-40e2-b4a7-e4b5408caa6e req-03573a4c-4a4e-44d1-9d3e-5b239c40c04a service nova] Acquiring lock "a924bdee-1e16-4d78-ac6b-9574677de55f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1304.716334] env[69648]: DEBUG oslo_concurrency.lockutils [req-bc93f57e-b522-40e2-b4a7-e4b5408caa6e req-03573a4c-4a4e-44d1-9d3e-5b239c40c04a service nova] Lock "a924bdee-1e16-4d78-ac6b-9574677de55f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1304.716505] env[69648]: DEBUG oslo_concurrency.lockutils [req-bc93f57e-b522-40e2-b4a7-e4b5408caa6e req-03573a4c-4a4e-44d1-9d3e-5b239c40c04a service nova] Lock "a924bdee-1e16-4d78-ac6b-9574677de55f-events" "released" 
by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1304.716671] env[69648]: DEBUG nova.compute.manager [req-bc93f57e-b522-40e2-b4a7-e4b5408caa6e req-03573a4c-4a4e-44d1-9d3e-5b239c40c04a service nova] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] No waiting events found dispatching network-vif-plugged-683ba951-6968-469d-996b-303a43d8206b {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1304.716839] env[69648]: WARNING nova.compute.manager [req-bc93f57e-b522-40e2-b4a7-e4b5408caa6e req-03573a4c-4a4e-44d1-9d3e-5b239c40c04a service nova] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Received unexpected event network-vif-plugged-683ba951-6968-469d-996b-303a43d8206b for instance with vm_state building and task_state spawning. [ 1304.811347] env[69648]: DEBUG nova.network.neutron [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Successfully updated port: 683ba951-6968-469d-996b-303a43d8206b {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1304.828801] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "refresh_cache-a924bdee-1e16-4d78-ac6b-9574677de55f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1304.828961] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquired lock "refresh_cache-a924bdee-1e16-4d78-ac6b-9574677de55f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1304.829163] env[69648]: DEBUG nova.network.neutron [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1304.895604] env[69648]: DEBUG nova.network.neutron [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1305.093166] env[69648]: DEBUG nova.network.neutron [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Updating instance_info_cache with network_info: [{"id": "683ba951-6968-469d-996b-303a43d8206b", "address": "fa:16:3e:4e:00:0b", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.178", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap683ba951-69", "ovs_interfaceid": "683ba951-6968-469d-996b-303a43d8206b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1305.113674] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Releasing lock "refresh_cache-a924bdee-1e16-4d78-ac6b-9574677de55f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1305.114087] env[69648]: DEBUG nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Instance network_info: |[{"id": "683ba951-6968-469d-996b-303a43d8206b", "address": "fa:16:3e:4e:00:0b", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.178", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap683ba951-69", "ovs_interfaceid": "683ba951-6968-469d-996b-303a43d8206b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1305.114549] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None 
req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4e:00:0b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92233552-2c0c-416e-9bf3-bfcca8eda2dc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '683ba951-6968-469d-996b-303a43d8206b', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1305.122560] env[69648]: DEBUG oslo.service.loopingcall [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1305.123101] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1305.123791] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c72d8566-f1e8-43e1-bd0d-73838db50829 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1305.143765] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1305.143765] env[69648]: value = "task-3466577" [ 1305.143765] env[69648]: _type = "Task" [ 1305.143765] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1305.153888] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466577, 'name': CreateVM_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1305.156680] env[69648]: DEBUG nova.compute.manager [req-f080fa20-1ed5-4fd0-a3d0-c0dde2979a23 req-e97ec5c5-dbc3-462a-a6a4-0e70295c2227 service nova] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Received event network-vif-plugged-8ba5496b-0209-43dc-bf01-d969c4cafaf4 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1305.156922] env[69648]: DEBUG oslo_concurrency.lockutils [req-f080fa20-1ed5-4fd0-a3d0-c0dde2979a23 req-e97ec5c5-dbc3-462a-a6a4-0e70295c2227 service nova] Acquiring lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1305.157145] env[69648]: DEBUG oslo_concurrency.lockutils [req-f080fa20-1ed5-4fd0-a3d0-c0dde2979a23 req-e97ec5c5-dbc3-462a-a6a4-0e70295c2227 service nova] Lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1305.157311] env[69648]: DEBUG oslo_concurrency.lockutils [req-f080fa20-1ed5-4fd0-a3d0-c0dde2979a23 req-e97ec5c5-dbc3-462a-a6a4-0e70295c2227 service nova] Lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1305.157476] env[69648]: DEBUG nova.compute.manager [req-f080fa20-1ed5-4fd0-a3d0-c0dde2979a23 req-e97ec5c5-dbc3-462a-a6a4-0e70295c2227 service nova] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] No waiting events found dispatching network-vif-plugged-8ba5496b-0209-43dc-bf01-d969c4cafaf4 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1305.157637] env[69648]: WARNING nova.compute.manager [req-f080fa20-1ed5-4fd0-a3d0-c0dde2979a23 req-e97ec5c5-dbc3-462a-a6a4-0e70295c2227 service nova] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Received unexpected event network-vif-plugged-8ba5496b-0209-43dc-bf01-d969c4cafaf4 for instance with vm_state building and task_state spawning. 
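The CreateVM_Task records above follow the usual oslo.vmware call shape: invoke the vSphere API method through the session, get back a task moref ("task-3466577"), then poll it until it completes ("progress is 0%" ... "completed successfully"). A rough sketch of that pattern, assuming an already configured oslo_vmware.api.VMwareAPISession; argument names are illustrative, not a verbatim copy of Nova's vm_util.create_vm():

```python
# Sketch of the invoke-then-wait pattern behind the CreateVM_Task records.
from oslo_vmware import api as vmware_api

# Illustrative session construction (values are placeholders):
# session = vmware_api.VMwareAPISession('vc1.example.test', 'user', 'secret',
#                                       api_retry_count=10,
#                                       task_poll_interval=5)


def create_vm(session, folder_ref, config_spec, respool_ref, host_ref=None):
    # "Invoking Folder.CreateVM_Task with opID=..." corresponds to this
    # invoke_api() call through the suds-based vim client.
    task_ref = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                                  config=config_spec, pool=respool_ref,
                                  host=host_ref)
    # "Waiting for the task ..." / "progress is 0%" / "completed successfully"
    # come from wait_for_task(), which polls the task until it finishes.
    task_info = session.wait_for_task(task_ref)
    return task_info.result  # managed object reference of the new VM
```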
[ 1305.220213] env[69648]: DEBUG nova.network.neutron [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Successfully updated port: 8ba5496b-0209-43dc-bf01-d969c4cafaf4 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1305.231621] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "refresh_cache-ba0b4adc-fa4a-4b36-bb86-58ff038c834e" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1305.231841] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired lock "refresh_cache-ba0b4adc-fa4a-4b36-bb86-58ff038c834e" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1305.232010] env[69648]: DEBUG nova.network.neutron [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1305.275123] env[69648]: DEBUG nova.network.neutron [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1305.498884] env[69648]: DEBUG nova.network.neutron [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Updating instance_info_cache with network_info: [{"id": "8ba5496b-0209-43dc-bf01-d969c4cafaf4", "address": "fa:16:3e:60:0b:57", "network": {"id": "58c9db42-24fd-4615-9f91-977554db657a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-27923875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "896367398859465488fc12205d122a4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8ba5496b-02", "ovs_interfaceid": "8ba5496b-0209-43dc-bf01-d969c4cafaf4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1305.511058] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Releasing lock 
"refresh_cache-ba0b4adc-fa4a-4b36-bb86-58ff038c834e" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1305.511379] env[69648]: DEBUG nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Instance network_info: |[{"id": "8ba5496b-0209-43dc-bf01-d969c4cafaf4", "address": "fa:16:3e:60:0b:57", "network": {"id": "58c9db42-24fd-4615-9f91-977554db657a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-27923875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "896367398859465488fc12205d122a4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8ba5496b-02", "ovs_interfaceid": "8ba5496b-0209-43dc-bf01-d969c4cafaf4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1305.511886] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:60:0b:57', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '52f465cb-7418-4172-bd7d-aec00abeb692', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8ba5496b-0209-43dc-bf01-d969c4cafaf4', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1305.519465] env[69648]: DEBUG oslo.service.loopingcall [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1305.519973] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1305.520229] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b549f112-133f-416f-9f7f-3f0b6d343957 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1305.539801] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1305.539801] env[69648]: value = "task-3466578" [ 1305.539801] env[69648]: _type = "Task" [ 1305.539801] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1305.548556] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466578, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1305.654244] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466577, 'name': CreateVM_Task, 'duration_secs': 0.299176} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1305.654431] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1305.655136] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1305.655322] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1305.655664] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1305.655931] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-61b6322d-252e-4b62-8c4a-6986a41d4595 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1305.660352] env[69648]: DEBUG oslo_vmware.api [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for the task: (returnval){ [ 1305.660352] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c1286a-c073-adf1-2acc-fbe2b4ac52a9" [ 1305.660352] env[69648]: _type = "Task" [ 1305.660352] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1305.668158] env[69648]: DEBUG oslo_vmware.api [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c1286a-c073-adf1-2acc-fbe2b4ac52a9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1306.050425] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466578, 'name': CreateVM_Task, 'duration_secs': 0.271572} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1306.050680] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1306.051227] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1306.169922] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1306.170218] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1306.170433] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1306.170641] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1306.171013] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1306.171209] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9b8e0c02-9922-4b2d-9f3b-c811b4006ace {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.175927] env[69648]: DEBUG oslo_vmware.api [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 1306.175927] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52a8abc0-7fd4-2029-ff03-982786abd24f" [ 1306.175927] env[69648]: _type = "Task" [ 1306.175927] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1306.183104] env[69648]: DEBUG oslo_vmware.api [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52a8abc0-7fd4-2029-ff03-982786abd24f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1306.685944] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1306.686165] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1306.686370] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1306.746952] env[69648]: DEBUG nova.compute.manager [req-60111c9b-0754-4f5f-9bfb-b857db817021 req-9f1f3656-39e7-41ce-95e8-d35f760fbb2b service nova] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Received event network-changed-683ba951-6968-469d-996b-303a43d8206b {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1306.747133] env[69648]: DEBUG nova.compute.manager [req-60111c9b-0754-4f5f-9bfb-b857db817021 req-9f1f3656-39e7-41ce-95e8-d35f760fbb2b service nova] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Refreshing instance network info cache due to event network-changed-683ba951-6968-469d-996b-303a43d8206b. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1306.747352] env[69648]: DEBUG oslo_concurrency.lockutils [req-60111c9b-0754-4f5f-9bfb-b857db817021 req-9f1f3656-39e7-41ce-95e8-d35f760fbb2b service nova] Acquiring lock "refresh_cache-a924bdee-1e16-4d78-ac6b-9574677de55f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1306.747514] env[69648]: DEBUG oslo_concurrency.lockutils [req-60111c9b-0754-4f5f-9bfb-b857db817021 req-9f1f3656-39e7-41ce-95e8-d35f760fbb2b service nova] Acquired lock "refresh_cache-a924bdee-1e16-4d78-ac6b-9574677de55f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1306.747688] env[69648]: DEBUG nova.network.neutron [req-60111c9b-0754-4f5f-9bfb-b857db817021 req-9f1f3656-39e7-41ce-95e8-d35f760fbb2b service nova] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Refreshing network info cache for port 683ba951-6968-469d-996b-303a43d8206b {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1307.045695] env[69648]: DEBUG nova.network.neutron [req-60111c9b-0754-4f5f-9bfb-b857db817021 req-9f1f3656-39e7-41ce-95e8-d35f760fbb2b service nova] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Updated VIF entry in instance network info cache for port 683ba951-6968-469d-996b-303a43d8206b. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1307.046065] env[69648]: DEBUG nova.network.neutron [req-60111c9b-0754-4f5f-9bfb-b857db817021 req-9f1f3656-39e7-41ce-95e8-d35f760fbb2b service nova] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Updating instance_info_cache with network_info: [{"id": "683ba951-6968-469d-996b-303a43d8206b", "address": "fa:16:3e:4e:00:0b", "network": {"id": "b0b0cb63-28dd-47ee-a694-0c4c3f19d35f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.178", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d4fe325ef395451d95fa750759fa3138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92233552-2c0c-416e-9bf3-bfcca8eda2dc", "external-id": "nsx-vlan-transportzone-251", "segmentation_id": 251, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap683ba951-69", "ovs_interfaceid": "683ba951-6968-469d-996b-303a43d8206b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1307.055675] env[69648]: DEBUG oslo_concurrency.lockutils [req-60111c9b-0754-4f5f-9bfb-b857db817021 req-9f1f3656-39e7-41ce-95e8-d35f760fbb2b service nova] Releasing lock "refresh_cache-a924bdee-1e16-4d78-ac6b-9574677de55f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1307.180842] env[69648]: DEBUG nova.compute.manager [req-8bdc6f04-8437-44be-807d-5beba3848241 req-5d2b1ba4-143c-4b6e-b97b-1e71f0e75018 service nova] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Received event network-changed-8ba5496b-0209-43dc-bf01-d969c4cafaf4 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1307.181056] env[69648]: DEBUG nova.compute.manager [req-8bdc6f04-8437-44be-807d-5beba3848241 req-5d2b1ba4-143c-4b6e-b97b-1e71f0e75018 service nova] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Refreshing instance network info cache due to event network-changed-8ba5496b-0209-43dc-bf01-d969c4cafaf4. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1307.181277] env[69648]: DEBUG oslo_concurrency.lockutils [req-8bdc6f04-8437-44be-807d-5beba3848241 req-5d2b1ba4-143c-4b6e-b97b-1e71f0e75018 service nova] Acquiring lock "refresh_cache-ba0b4adc-fa4a-4b36-bb86-58ff038c834e" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1307.181421] env[69648]: DEBUG oslo_concurrency.lockutils [req-8bdc6f04-8437-44be-807d-5beba3848241 req-5d2b1ba4-143c-4b6e-b97b-1e71f0e75018 service nova] Acquired lock "refresh_cache-ba0b4adc-fa4a-4b36-bb86-58ff038c834e" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1307.181581] env[69648]: DEBUG nova.network.neutron [req-8bdc6f04-8437-44be-807d-5beba3848241 req-5d2b1ba4-143c-4b6e-b97b-1e71f0e75018 service nova] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Refreshing network info cache for port 8ba5496b-0209-43dc-bf01-d969c4cafaf4 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1307.430365] env[69648]: DEBUG nova.network.neutron [req-8bdc6f04-8437-44be-807d-5beba3848241 req-5d2b1ba4-143c-4b6e-b97b-1e71f0e75018 service nova] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Updated VIF entry in instance network info cache for port 8ba5496b-0209-43dc-bf01-d969c4cafaf4. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1307.430718] env[69648]: DEBUG nova.network.neutron [req-8bdc6f04-8437-44be-807d-5beba3848241 req-5d2b1ba4-143c-4b6e-b97b-1e71f0e75018 service nova] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Updating instance_info_cache with network_info: [{"id": "8ba5496b-0209-43dc-bf01-d969c4cafaf4", "address": "fa:16:3e:60:0b:57", "network": {"id": "58c9db42-24fd-4615-9f91-977554db657a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-27923875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "896367398859465488fc12205d122a4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8ba5496b-02", "ovs_interfaceid": "8ba5496b-0209-43dc-bf01-d969c4cafaf4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1307.439794] env[69648]: DEBUG oslo_concurrency.lockutils [req-8bdc6f04-8437-44be-807d-5beba3848241 req-5d2b1ba4-143c-4b6e-b97b-1e71f0e75018 service nova] Releasing lock "refresh_cache-ba0b4adc-fa4a-4b36-bb86-58ff038c834e" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1308.648174] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1313.065162] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1314.065247] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1314.065587] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1315.061133] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1316.065112] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task 
ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1317.065369] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1317.065718] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1318.066056] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1318.066327] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1318.066438] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1318.088990] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1318.088990] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1318.088990] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1318.088990] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1318.088990] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1318.089203] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1318.089203] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1318.089203] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1318.089203] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1318.089203] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1318.089340] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1319.065826] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1319.076614] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1319.076942] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1319.076994] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1319.077151] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1319.078264] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1c935e2-d981-42c6-87bd-04be66ffdebf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.087162] 
env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee0bb92e-3c8f-4144-9305-bb2b9b94e9ae {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.101093] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f27b58cc-9dfc-4c3f-934d-a6cd5a52e83c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.107377] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bca759f-8440-4b5a-b09d-7a7ec9b49763 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.136120] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180971MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1319.136298] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1319.136490] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1319.210176] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1319.210351] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60b00251-25fc-483d-88fe-a84165d6a435 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1319.210483] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6062dd02-230d-42bc-8304-fc122f1f1489 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1319.210609] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e64fd474-91ab-449e-8785-e788685ed77a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1319.210731] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1319.210853] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d5fb115d-778d-4fc7-a03a-8f5828868a01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1319.210971] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ab839f84-b864-409e-883d-00dddb5db3db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1319.211164] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1319.211238] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a924bdee-1e16-4d78-ac6b-9574677de55f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1319.211357] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1319.222585] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 58804be5-ee46-4b25-be84-890d5cd1607f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1319.233795] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 74a74c62-5c24-426e-ae6f-29511de99462 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1319.243425] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60cb1a27-ecc3-43a6-8efa-b54fd2f400ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1319.255252] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c97308be-406b-4fd0-b502-69e8c800773f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1319.265137] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8426cc36-f026-46d9-844e-432343410efe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1319.274427] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 27bd5158-0b5f-408d-b91c-e7f3fbde894e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1319.284047] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0af9540b-b092-4396-8573-cdadc66abe02 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1319.293153] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0e63df98-0345-4d75-b128-291048849e40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1319.302250] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance b7cd17f7-89f5-4f85-8964-532467432b59 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1319.302486] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1319.302634] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1319.512245] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a69d23ef-e140-4e0e-b41a-84484672ac97 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.519670] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fc97fb0-fe95-4ed2-a45e-a40bf9231481 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.548914] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05ad6916-f9d7-4308-b82f-14fb2d2bc9ec {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.559037] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa15060a-d4e7-4f1a-bd9c-f6e3b9ac3aa3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.571456] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1319.580173] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1319.600684] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1319.600852] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.464s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1321.807499] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1321.807749] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1324.949701] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "a924bdee-1e16-4d78-ac6b-9574677de55f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1348.947586] env[69648]: WARNING oslo_vmware.rw_handles [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1348.947586] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1348.947586] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1348.947586] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1348.947586] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1348.947586] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1348.947586] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1348.947586] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1348.947586] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1348.947586] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1348.947586] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1348.947586] env[69648]: ERROR oslo_vmware.rw_handles [ 1348.948503] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/b51c3ba4-dbf6-467d-bbc4-cd7c7aa569cb/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1348.950826] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e
tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1348.951093] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Copying Virtual Disk [datastore1] vmware_temp/b51c3ba4-dbf6-467d-bbc4-cd7c7aa569cb/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/b51c3ba4-dbf6-467d-bbc4-cd7c7aa569cb/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1348.951398] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9628ecaa-3fd8-43fe-be8e-307247e914d5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.960382] env[69648]: DEBUG oslo_vmware.api [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Waiting for the task: (returnval){ [ 1348.960382] env[69648]: value = "task-3466579" [ 1348.960382] env[69648]: _type = "Task" [ 1348.960382] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1348.967922] env[69648]: DEBUG oslo_vmware.api [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Task: {'id': task-3466579, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1349.471300] env[69648]: DEBUG oslo_vmware.exceptions [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1349.471544] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1349.472112] env[69648]: ERROR nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1349.472112] env[69648]: Faults: ['InvalidArgument'] [ 1349.472112] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Traceback (most recent call last): [ 1349.472112] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1349.472112] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] yield resources [ 1349.472112] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1349.472112] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] self.driver.spawn(context, instance, image_meta, [ 1349.472112] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1349.472112] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1349.472112] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1349.472112] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] self._fetch_image_if_missing(context, vi) [ 1349.472112] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] image_cache(vi, tmp_image_ds_loc) [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] vm_util.copy_virtual_disk( [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] session._wait_for_task(vmdk_copy_task) [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] return self.wait_for_task(task_ref) [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] return evt.wait() [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] result = hub.switch() [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1349.472599] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] return self.greenlet.switch() [ 1349.472924] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1349.472924] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] self.f(*self.args, **self.kw) [ 1349.472924] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1349.472924] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] raise exceptions.translate_fault(task_info.error) [ 1349.472924] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1349.472924] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Faults: ['InvalidArgument'] [ 1349.472924] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] [ 1349.472924] env[69648]: INFO nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Terminating instance [ 1349.473986] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1349.475040] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1349.475040] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-8933bc9f-4a41-457f-b119-ed5d922d71f3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.476774] env[69648]: DEBUG nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1349.476944] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1349.477681] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1387e99-069c-412a-b78b-23300949fd51 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.485130] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1349.485358] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e86a6d46-0017-44f1-b24b-70b06a30fc4f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.487505] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1349.487684] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1349.488642] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-178a66f9-04e0-474e-a37c-398aa4e1b4b5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.493570] env[69648]: DEBUG oslo_vmware.api [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Waiting for the task: (returnval){ [ 1349.493570] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]521a60c8-0cb9-1c58-b73b-c3a2480835bf" [ 1349.493570] env[69648]: _type = "Task" [ 1349.493570] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1349.500749] env[69648]: DEBUG oslo_vmware.api [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]521a60c8-0cb9-1c58-b73b-c3a2480835bf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1349.563591] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1349.563806] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1349.563989] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Deleting the datastore file [datastore1] 8e6a4fd6-5f80-476d-9789-adea1be2ae72 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1349.564331] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d0cb923f-1104-4777-9240-5bb60a7adf5a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.571749] env[69648]: DEBUG oslo_vmware.api [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Waiting for the task: (returnval){ [ 1349.571749] env[69648]: value = "task-3466581" [ 1349.571749] env[69648]: _type = "Task" [ 1349.571749] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1349.579445] env[69648]: DEBUG oslo_vmware.api [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Task: {'id': task-3466581, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1350.004635] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1350.004975] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Creating directory with path [datastore1] vmware_temp/60231de0-e83b-4049-b752-cff5d4eb9e94/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1350.005135] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4aa3b7c9-79ea-499f-be3c-d791b70d5b02 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.017078] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Created directory with path [datastore1] vmware_temp/60231de0-e83b-4049-b752-cff5d4eb9e94/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1350.017271] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Fetch image to [datastore1] vmware_temp/60231de0-e83b-4049-b752-cff5d4eb9e94/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1350.017443] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/60231de0-e83b-4049-b752-cff5d4eb9e94/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1350.018182] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6027cebd-d458-4b84-a78d-cc0c987a1b30 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.024768] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9da6f599-a1d0-4ec3-ac79-6177bd56e431 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.033632] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fc0156e-167f-4f6c-a4f5-b4119fe05340 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.064805] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c0278c7b-484b-4950-bdce-88f3b2ba3029 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.070297] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-222b65b5-d066-420f-9efa-05e6fc9ed739 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.079079] env[69648]: DEBUG oslo_vmware.api [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Task: {'id': task-3466581, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071055} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1350.079307] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1350.079486] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1350.079656] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1350.079822] env[69648]: INFO nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Took 0.60 seconds to destroy the instance on the hypervisor. 
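The entries above show a DeleteDatastoreFile_Task being created and then polled ("progress is 0%" ... "completed successfully ... duration_secs") before the destroy path continues. Below is a minimal, self-contained sketch of that poll-until-complete pattern; it is not the oslo.vmware wait_for_task implementation, and TaskInfo/fetch_task_info are hypothetical stand-ins for the real task handle.

    # Illustrative sketch of a poll-until-done loop (hypothetical names, stdlib only).
    import time
    from dataclasses import dataclass

    @dataclass
    class TaskInfo:
        state: str              # "running", "success" or "error"
        progress: int           # 0-100, like the "progress is 0%" entries
        error: str | None = None

    def wait_for_task(fetch_task_info, poll_interval=0.5, timeout=60.0):
        """Poll a task until it succeeds, fails or times out."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = fetch_task_info()
            if info.state == "success":
                return info
            if info.state == "error":
                raise RuntimeError(info.error or "task failed")
            time.sleep(poll_interval)
        raise TimeoutError("task did not complete in time")

    if __name__ == "__main__":
        # Simulate a task that reports progress twice and then succeeds.
        states = iter([TaskInfo("running", 0), TaskInfo("running", 50), TaskInfo("success", 100)])
        print(wait_for_task(lambda: next(states), poll_interval=0.01))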
[ 1350.081915] env[69648]: DEBUG nova.compute.claims [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1350.082094] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1350.082337] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1350.090465] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1350.219728] env[69648]: DEBUG oslo_vmware.rw_handles [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/60231de0-e83b-4049-b752-cff5d4eb9e94/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1350.279896] env[69648]: DEBUG oslo_vmware.rw_handles [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1350.280202] env[69648]: DEBUG oslo_vmware.rw_handles [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/60231de0-e83b-4049-b752-cff5d4eb9e94/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1350.413955] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69f69ae0-1b13-49d4-a679-2ad2bfa0b4ba {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.421872] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79fcc6eb-dcdc-43b9-bd20-eba90e6f0d82 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.452325] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09791ee9-3ad7-4e66-9a3c-622ee50fefcc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.459723] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb932392-aa39-4b23-aee1-b572727cfe23 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1350.474036] env[69648]: DEBUG nova.compute.provider_tree [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1350.482248] env[69648]: DEBUG nova.scheduler.client.report [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1350.496939] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.414s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.497492] env[69648]: ERROR nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1350.497492] env[69648]: Faults: ['InvalidArgument'] [ 1350.497492] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Traceback (most recent call last): [ 1350.497492] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/compute/manager.py", line 
2616, in _build_and_run_instance [ 1350.497492] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] self.driver.spawn(context, instance, image_meta, [ 1350.497492] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1350.497492] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1350.497492] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1350.497492] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] self._fetch_image_if_missing(context, vi) [ 1350.497492] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1350.497492] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] image_cache(vi, tmp_image_ds_loc) [ 1350.497492] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] vm_util.copy_virtual_disk( [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] session._wait_for_task(vmdk_copy_task) [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] return self.wait_for_task(task_ref) [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] return evt.wait() [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] result = hub.switch() [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] return self.greenlet.switch() [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1350.497841] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] self.f(*self.args, **self.kw) [ 1350.498213] env[69648]: ERROR nova.compute.manager [instance: 
8e6a4fd6-5f80-476d-9789-adea1be2ae72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1350.498213] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] raise exceptions.translate_fault(task_info.error) [ 1350.498213] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1350.498213] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Faults: ['InvalidArgument'] [ 1350.498213] env[69648]: ERROR nova.compute.manager [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] [ 1350.498213] env[69648]: DEBUG nova.compute.utils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1350.499537] env[69648]: DEBUG nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Build of instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 was re-scheduled: A specified parameter was not correct: fileType [ 1350.499537] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1350.499909] env[69648]: DEBUG nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1350.500090] env[69648]: DEBUG nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1350.500264] env[69648]: DEBUG nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1350.500453] env[69648]: DEBUG nova.network.neutron [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1350.953325] env[69648]: DEBUG nova.network.neutron [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1350.968231] env[69648]: INFO nova.compute.manager [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Took 0.47 seconds to deallocate network for instance. [ 1351.064824] env[69648]: INFO nova.scheduler.client.report [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Deleted allocations for instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 [ 1351.083216] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62231be6-1b2e-4e8d-8093-d59c7b87555e tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 622.372s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1351.084315] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 426.287s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1351.084526] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Acquiring lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1351.084732] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] 
Lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1351.084896] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1351.086828] env[69648]: INFO nova.compute.manager [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Terminating instance [ 1351.088424] env[69648]: DEBUG nova.compute.manager [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1351.088619] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1351.089100] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-79d11165-a9cc-4441-bb64-cdc05dc43ed2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.098166] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1159a0c2-13e5-493d-a685-1b7959558fbf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.108205] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1351.127896] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8e6a4fd6-5f80-476d-9789-adea1be2ae72 could not be found. 
[ 1351.128110] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1351.128283] env[69648]: INFO nova.compute.manager [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1351.128518] env[69648]: DEBUG oslo.service.loopingcall [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1351.128718] env[69648]: DEBUG nova.compute.manager [-] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1351.128813] env[69648]: DEBUG nova.network.neutron [-] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1351.151377] env[69648]: DEBUG nova.network.neutron [-] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1351.156845] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1351.157098] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1351.158461] env[69648]: INFO nova.compute.claims [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1351.161324] env[69648]: INFO nova.compute.manager [-] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] Took 0.03 seconds to deallocate network for instance. 
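The "Claim successful on node ..." entry here, together with the earlier "Aborting claim" for the failed build, reflects resource-tracker bookkeeping: capacity is reserved before a spawn and handed back if the build fails and is re-scheduled. The following is a deliberately tiny sketch of that claim/abort accounting; Tracker and its fields are hypothetical and far simpler than nova's ResourceTracker.

    # Illustrative claim/abort accounting (hypothetical Tracker, vcpus only).
    from dataclasses import dataclass

    @dataclass
    class Tracker:
        vcpus_total: int
        vcpus_used: int = 0

        def claim(self, vcpus: int) -> bool:
            """Reserve vcpus if capacity allows; return True on success."""
            if self.vcpus_used + vcpus > self.vcpus_total:
                return False
            self.vcpus_used += vcpus
            return True

        def abort(self, vcpus: int) -> None:
            """Release a reservation, e.g. after a failed spawn is re-scheduled."""
            self.vcpus_used = max(0, self.vcpus_used - vcpus)

    if __name__ == "__main__":
        t = Tracker(vcpus_total=48)
        assert t.claim(1)    # claim for an instance being built
        t.abort(1)           # build failed: give the capacity back
        print(t)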
[ 1351.245534] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e7c6d092-b29f-4172-a0e8-94193526e320 tempest-ImagesOneServerNegativeTestJSON-161289767 tempest-ImagesOneServerNegativeTestJSON-161289767-project-member] Lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.161s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1351.246458] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 177.230s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1351.246645] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 8e6a4fd6-5f80-476d-9789-adea1be2ae72] During sync_power_state the instance has a pending task (deleting). Skip. [ 1351.246816] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "8e6a4fd6-5f80-476d-9789-adea1be2ae72" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1351.406133] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af22ba72-a6aa-4e8b-bc3f-aa06d96fdc6d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.414778] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e342feff-4c4e-4c14-85a8-6be64d00726a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.443541] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc3c5fc5-4f62-4050-89bb-b984e2fa12b0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.450466] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7887decb-a4e2-4015-9498-872217828bb6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.463582] env[69648]: DEBUG nova.compute.provider_tree [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1351.473935] env[69648]: DEBUG nova.scheduler.client.report [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1351.491140] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1351.491605] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1351.523053] env[69648]: DEBUG nova.compute.utils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1351.523756] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1351.524881] env[69648]: DEBUG nova.network.neutron [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1351.533752] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1351.595460] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1351.619197] env[69648]: DEBUG nova.policy [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6e63e22c15c457abb91ad9f4cde2983', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c24c07422cdb4ae193a0ad8fde391d7a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1351.623939] env[69648]: DEBUG nova.virt.hardware [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1351.624192] env[69648]: DEBUG nova.virt.hardware [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1351.624410] env[69648]: DEBUG nova.virt.hardware [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1351.624645] env[69648]: DEBUG nova.virt.hardware [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1351.624756] env[69648]: DEBUG nova.virt.hardware [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1351.624927] env[69648]: DEBUG nova.virt.hardware [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1351.625145] env[69648]: DEBUG nova.virt.hardware [None 
req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1351.625324] env[69648]: DEBUG nova.virt.hardware [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1351.625499] env[69648]: DEBUG nova.virt.hardware [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1351.625664] env[69648]: DEBUG nova.virt.hardware [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1351.625847] env[69648]: DEBUG nova.virt.hardware [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1351.626717] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a5c334a-dd74-4566-be40-879049366a6c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.635542] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d7fb018-0deb-427f-9b19-8528d4752938 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.995751] env[69648]: DEBUG nova.network.neutron [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Successfully created port: 9ba1d306-b695-46db-8de2-33d40447fa64 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1352.771238] env[69648]: DEBUG nova.network.neutron [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Successfully updated port: 9ba1d306-b695-46db-8de2-33d40447fa64 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1352.788199] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "refresh_cache-58804be5-ee46-4b25-be84-890d5cd1607f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1352.788557] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 
tempest-MultipleCreateTestJSON-846010339-project-member] Acquired lock "refresh_cache-58804be5-ee46-4b25-be84-890d5cd1607f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1352.788557] env[69648]: DEBUG nova.network.neutron [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1352.840140] env[69648]: DEBUG nova.network.neutron [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1352.996608] env[69648]: DEBUG nova.compute.manager [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Received event network-vif-plugged-9ba1d306-b695-46db-8de2-33d40447fa64 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1352.996832] env[69648]: DEBUG oslo_concurrency.lockutils [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] Acquiring lock "58804be5-ee46-4b25-be84-890d5cd1607f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1352.997051] env[69648]: DEBUG oslo_concurrency.lockutils [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] Lock "58804be5-ee46-4b25-be84-890d5cd1607f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1352.997226] env[69648]: DEBUG oslo_concurrency.lockutils [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] Lock "58804be5-ee46-4b25-be84-890d5cd1607f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1352.997396] env[69648]: DEBUG nova.compute.manager [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] No waiting events found dispatching network-vif-plugged-9ba1d306-b695-46db-8de2-33d40447fa64 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1352.997562] env[69648]: WARNING nova.compute.manager [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Received unexpected event network-vif-plugged-9ba1d306-b695-46db-8de2-33d40447fa64 for instance with vm_state building and task_state spawning. 
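The instance_info_cache entries above carry a network_info list per VIF (port id, MAC address, subnets and fixed IPs). The short sketch below uses a trimmed copy of the exact structure logged above to show how the port id, MAC and fixed IP can be read out of that shape; it is plain illustrative Python, not nova's network model classes.

    # Illustrative read-out of a network_info entry (data trimmed from the cache entry above).
    network_info = [{
        "id": "9ba1d306-b695-46db-8de2-33d40447fa64",
        "address": "fa:16:3e:c3:a9:92",
        "network": {
            "id": "130e81f4-2301-4499-b916-449ad32b9389",
            "bridge": "br-int",
            "subnets": [{
                "cidr": "192.168.128.0/28",
                "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4}],
            }],
        },
        "type": "ovs",
        "devname": "tap9ba1d306-b6",
    }]

    for vif in network_info:
        fixed_ips = [
            ip["address"]
            for subnet in vif["network"]["subnets"]
            for ip in subnet["ips"]
            if ip.get("type") == "fixed"
        ]
        print(vif["id"], vif["address"], fixed_ips)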
[ 1352.997721] env[69648]: DEBUG nova.compute.manager [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Received event network-changed-9ba1d306-b695-46db-8de2-33d40447fa64 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1352.998060] env[69648]: DEBUG nova.compute.manager [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Refreshing instance network info cache due to event network-changed-9ba1d306-b695-46db-8de2-33d40447fa64. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1352.998147] env[69648]: DEBUG oslo_concurrency.lockutils [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] Acquiring lock "refresh_cache-58804be5-ee46-4b25-be84-890d5cd1607f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1353.033907] env[69648]: DEBUG nova.network.neutron [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Updating instance_info_cache with network_info: [{"id": "9ba1d306-b695-46db-8de2-33d40447fa64", "address": "fa:16:3e:c3:a9:92", "network": {"id": "130e81f4-2301-4499-b916-449ad32b9389", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-868563699-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c24c07422cdb4ae193a0ad8fde391d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9ba1d306-b6", "ovs_interfaceid": "9ba1d306-b695-46db-8de2-33d40447fa64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1353.050271] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Releasing lock "refresh_cache-58804be5-ee46-4b25-be84-890d5cd1607f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1353.050569] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Instance network_info: |[{"id": "9ba1d306-b695-46db-8de2-33d40447fa64", "address": "fa:16:3e:c3:a9:92", "network": {"id": "130e81f4-2301-4499-b916-449ad32b9389", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-868563699-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], 
"gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c24c07422cdb4ae193a0ad8fde391d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9ba1d306-b6", "ovs_interfaceid": "9ba1d306-b695-46db-8de2-33d40447fa64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1353.050858] env[69648]: DEBUG oslo_concurrency.lockutils [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] Acquired lock "refresh_cache-58804be5-ee46-4b25-be84-890d5cd1607f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1353.051716] env[69648]: DEBUG nova.network.neutron [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Refreshing network info cache for port 9ba1d306-b695-46db-8de2-33d40447fa64 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1353.052086] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c3:a9:92', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f54f7284-8f7d-47ee-839d-2143062cfe44', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9ba1d306-b695-46db-8de2-33d40447fa64', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1353.060266] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Creating folder: Project (c24c07422cdb4ae193a0ad8fde391d7a). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1353.063160] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-db975692-8de5-4b48-bc5f-becf69e46370 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1353.075172] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Created folder: Project (c24c07422cdb4ae193a0ad8fde391d7a) in parent group-v692308. [ 1353.075387] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Creating folder: Instances. Parent ref: group-v692381. 
{{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1353.075616] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a4a00cb6-44e5-4ab5-ada5-2821ed7cd667 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1353.084755] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Created folder: Instances in parent group-v692381. [ 1353.084934] env[69648]: DEBUG oslo.service.loopingcall [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1353.085283] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1353.085384] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ad342b7a-bfab-4a0d-beff-f787a64c9563 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1353.106201] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1353.106201] env[69648]: value = "task-3466584" [ 1353.106201] env[69648]: _type = "Task" [ 1353.106201] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1353.113713] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466584, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1353.299456] env[69648]: DEBUG nova.network.neutron [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Updated VIF entry in instance network info cache for port 9ba1d306-b695-46db-8de2-33d40447fa64. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1353.299824] env[69648]: DEBUG nova.network.neutron [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Updating instance_info_cache with network_info: [{"id": "9ba1d306-b695-46db-8de2-33d40447fa64", "address": "fa:16:3e:c3:a9:92", "network": {"id": "130e81f4-2301-4499-b916-449ad32b9389", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-868563699-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c24c07422cdb4ae193a0ad8fde391d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9ba1d306-b6", "ovs_interfaceid": "9ba1d306-b695-46db-8de2-33d40447fa64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1353.309319] env[69648]: DEBUG oslo_concurrency.lockutils [req-1953039d-3742-4248-b0eb-3d31ddee5737 req-8320d4b9-71d0-456a-b78c-27882d817eec service nova] Releasing lock "refresh_cache-58804be5-ee46-4b25-be84-890d5cd1607f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1353.615581] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466584, 'name': CreateVM_Task, 'duration_secs': 0.387956} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1353.615763] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1353.616415] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1353.616579] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1353.616930] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1353.617188] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-797f6962-74da-4c02-83be-fb65dba213dd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1353.621433] env[69648]: DEBUG oslo_vmware.api [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: (returnval){ [ 1353.621433] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5229c622-f6ba-cb0b-0c98-48d57b5e8eca" [ 1353.621433] env[69648]: _type = "Task" [ 1353.621433] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1353.633653] env[69648]: DEBUG oslo_vmware.api [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5229c622-f6ba-cb0b-0c98-48d57b5e8eca, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1354.131853] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1354.132161] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1354.132343] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1370.603815] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1371.060544] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1373.065443] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1374.064611] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1375.066053] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1377.061215] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1378.066048] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1378.067106] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] 
Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1378.067106] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1379.066059] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1379.077755] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1379.077982] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1379.078184] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1379.078344] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1379.079481] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d55f598c-4fd0-4a59-bf7f-557e3df8121f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.089369] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ac29871-c4ed-48ad-8949-07154a310f24 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.102990] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd22ae1f-37e9-4d8d-9b79-1c853339f275 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.108991] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74e2105b-6aee-4897-8645-c17ea772fd36 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.137159] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180968MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 
1379.137305] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1379.137494] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1379.217637] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60b00251-25fc-483d-88fe-a84165d6a435 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1379.217809] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6062dd02-230d-42bc-8304-fc122f1f1489 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1379.217939] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e64fd474-91ab-449e-8785-e788685ed77a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1379.218079] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1379.218207] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d5fb115d-778d-4fc7-a03a-8f5828868a01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1379.218327] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ab839f84-b864-409e-883d-00dddb5db3db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1379.218443] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1379.218560] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a924bdee-1e16-4d78-ac6b-9574677de55f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1379.218676] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1379.218793] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 58804be5-ee46-4b25-be84-890d5cd1607f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1379.231053] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 74a74c62-5c24-426e-ae6f-29511de99462 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1379.241397] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 60cb1a27-ecc3-43a6-8efa-b54fd2f400ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1379.251762] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c97308be-406b-4fd0-b502-69e8c800773f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1379.262945] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 8426cc36-f026-46d9-844e-432343410efe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1379.275125] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 27bd5158-0b5f-408d-b91c-e7f3fbde894e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1379.285144] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0af9540b-b092-4396-8573-cdadc66abe02 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1379.295864] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0e63df98-0345-4d75-b128-291048849e40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1379.305831] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance b7cd17f7-89f5-4f85-8964-532467432b59 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1379.315836] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1379.316114] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1379.316243] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1379.533086] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e3fad4d-8664-46bf-bfab-629a68078b32 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.540623] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24a969f7-7773-4758-a9d1-aa17ebcbf0fa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.572098] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02e57100-7a5e-43f3-a69c-585d07f60d60 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.577487] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c21dc6f-7c5c-4dc8-974c-329ebcbb9362 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.590624] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1379.599050] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1379.613337] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1379.613517] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.476s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1380.614008] env[69648]: DEBUG oslo_service.periodic_task [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1380.614273] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1380.614335] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1380.635283] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1380.635455] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1380.635560] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1380.635713] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1380.635844] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1380.635967] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1380.636102] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1380.636225] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1380.636347] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1380.636466] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1380.636586] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1386.595228] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1389.722865] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1389.723199] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1398.368317] env[69648]: DEBUG oslo_concurrency.lockutils [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "58804be5-ee46-4b25-be84-890d5cd1607f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1398.966261] env[69648]: WARNING oslo_vmware.rw_handles [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1398.966261] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1398.966261] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1398.966261] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1398.966261] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1398.966261] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1398.966261] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1398.966261] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = 
self._read_status() [ 1398.966261] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1398.966261] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1398.966261] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1398.966261] env[69648]: ERROR oslo_vmware.rw_handles [ 1398.966924] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/60231de0-e83b-4049-b752-cff5d4eb9e94/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1398.968796] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1398.969091] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Copying Virtual Disk [datastore1] vmware_temp/60231de0-e83b-4049-b752-cff5d4eb9e94/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/60231de0-e83b-4049-b752-cff5d4eb9e94/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1398.969383] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ea9625be-65ff-4fab-9ce0-4ff105685d85 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.976720] env[69648]: DEBUG oslo_vmware.api [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Waiting for the task: (returnval){ [ 1398.976720] env[69648]: value = "task-3466585" [ 1398.976720] env[69648]: _type = "Task" [ 1398.976720] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1398.984973] env[69648]: DEBUG oslo_vmware.api [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Task: {'id': task-3466585, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1399.487735] env[69648]: DEBUG oslo_vmware.exceptions [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1399.488050] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1399.488581] env[69648]: ERROR nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1399.488581] env[69648]: Faults: ['InvalidArgument'] [ 1399.488581] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Traceback (most recent call last): [ 1399.488581] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1399.488581] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] yield resources [ 1399.488581] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1399.488581] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] self.driver.spawn(context, instance, image_meta, [ 1399.488581] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1399.488581] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1399.488581] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1399.488581] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] self._fetch_image_if_missing(context, vi) [ 1399.488581] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] image_cache(vi, tmp_image_ds_loc) [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] vm_util.copy_virtual_disk( [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] session._wait_for_task(vmdk_copy_task) [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] return self.wait_for_task(task_ref) [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] return evt.wait() [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] result = hub.switch() [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1399.488991] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] return self.greenlet.switch() [ 1399.489358] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1399.489358] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] self.f(*self.args, **self.kw) [ 1399.489358] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1399.489358] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] raise exceptions.translate_fault(task_info.error) [ 1399.489358] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1399.489358] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Faults: ['InvalidArgument'] [ 1399.489358] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] [ 1399.489358] env[69648]: INFO nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Terminating instance [ 1399.490445] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1399.490654] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1399.491293] env[69648]: DEBUG nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 
tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1399.491480] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1399.491701] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9394fe91-1deb-4474-8844-3ec9ae60577b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.493944] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cec18110-a9bb-4b80-b531-08728dc6d5c6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.500363] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1399.500584] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6dcbac92-e94c-4c97-99b0-10c967d585d4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.503856] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1399.504037] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1399.504675] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5be2459b-f2f2-4c0f-8e99-958406fa694d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.509216] env[69648]: DEBUG oslo_vmware.api [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Waiting for the task: (returnval){ [ 1399.509216] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52627510-aee8-cb62-8a2f-c0ec76bcfb68" [ 1399.509216] env[69648]: _type = "Task" [ 1399.509216] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1399.516511] env[69648]: DEBUG oslo_vmware.api [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52627510-aee8-cb62-8a2f-c0ec76bcfb68, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1399.564392] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1399.564625] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1399.564865] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Deleting the datastore file [datastore1] 60b00251-25fc-483d-88fe-a84165d6a435 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1399.565168] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6cc0ed03-5267-4b94-9048-4afb9cce3fd0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.571272] env[69648]: DEBUG oslo_vmware.api [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Waiting for the task: (returnval){ [ 1399.571272] env[69648]: value = "task-3466587" [ 1399.571272] env[69648]: _type = "Task" [ 1399.571272] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1399.578672] env[69648]: DEBUG oslo_vmware.api [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Task: {'id': task-3466587, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1400.019685] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1400.019945] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Creating directory with path [datastore1] vmware_temp/94f025bd-222c-424d-837e-f85ee9dd225a/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1400.020195] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f2beec53-6238-47d2-b2f1-15aeb896a2c7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.033685] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Created directory with path [datastore1] vmware_temp/94f025bd-222c-424d-837e-f85ee9dd225a/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1400.033871] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Fetch image to [datastore1] vmware_temp/94f025bd-222c-424d-837e-f85ee9dd225a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1400.034053] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/94f025bd-222c-424d-837e-f85ee9dd225a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1400.034761] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-419fc670-a5b0-4fe1-998e-7eccd16b5bc6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.041070] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-885cedc6-c4ec-49ad-83cc-149637339ed7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.049812] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee616fa1-0b22-4ee2-835e-2f81617ab758 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.083619] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8c9b49d-cfc2-4844-ae90-05de22af26f3 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.090658] env[69648]: DEBUG oslo_vmware.api [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Task: {'id': task-3466587, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.086592} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1400.092068] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1400.092267] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1400.092493] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1400.092691] env[69648]: INFO nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1400.094380] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cf919036-0d02-4a65-9382-ef2cc929631c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.096310] env[69648]: DEBUG nova.compute.claims [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1400.096483] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1400.096694] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1400.119063] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1400.175263] env[69648]: DEBUG oslo_vmware.rw_handles [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/94f025bd-222c-424d-837e-f85ee9dd225a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1400.235422] env[69648]: DEBUG oslo_vmware.rw_handles [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1400.235631] env[69648]: DEBUG oslo_vmware.rw_handles [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/94f025bd-222c-424d-837e-f85ee9dd225a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1400.407686] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d04c879e-fb68-48b0-a9ca-c69a2c0dd6d6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.415733] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af70411e-19dd-4a10-9a7d-8bf62e88d432 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.445946] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-820ff4fa-accf-4604-977a-e0c163f66a71 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.452563] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63385792-f693-4b14-9b2e-2ac8e67d24fe {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.465491] env[69648]: DEBUG nova.compute.provider_tree [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1400.473761] env[69648]: DEBUG nova.scheduler.client.report [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1400.487726] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.391s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1400.489028] env[69648]: ERROR nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1400.489028] env[69648]: Faults: ['InvalidArgument'] [ 1400.489028] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Traceback (most recent call last): [ 1400.489028] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1400.489028] 
env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] self.driver.spawn(context, instance, image_meta, [ 1400.489028] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1400.489028] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1400.489028] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1400.489028] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] self._fetch_image_if_missing(context, vi) [ 1400.489028] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1400.489028] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] image_cache(vi, tmp_image_ds_loc) [ 1400.489028] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] vm_util.copy_virtual_disk( [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] session._wait_for_task(vmdk_copy_task) [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] return self.wait_for_task(task_ref) [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] return evt.wait() [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] result = hub.switch() [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] return self.greenlet.switch() [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1400.489530] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] self.f(*self.args, **self.kw) [ 1400.489806] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1400.489806] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] raise exceptions.translate_fault(task_info.error) [ 1400.489806] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1400.489806] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Faults: ['InvalidArgument'] [ 1400.489806] env[69648]: ERROR nova.compute.manager [instance: 60b00251-25fc-483d-88fe-a84165d6a435] [ 1400.489806] env[69648]: DEBUG nova.compute.utils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1400.490661] env[69648]: DEBUG nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Build of instance 60b00251-25fc-483d-88fe-a84165d6a435 was re-scheduled: A specified parameter was not correct: fileType [ 1400.490661] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1400.491041] env[69648]: DEBUG nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1400.491230] env[69648]: DEBUG nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1400.491399] env[69648]: DEBUG nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1400.491582] env[69648]: DEBUG nova.network.neutron [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1400.896398] env[69648]: DEBUG nova.network.neutron [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1400.908652] env[69648]: INFO nova.compute.manager [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Took 0.42 seconds to deallocate network for instance. [ 1401.012019] env[69648]: INFO nova.scheduler.client.report [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Deleted allocations for instance 60b00251-25fc-483d-88fe-a84165d6a435 [ 1401.035309] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e413756-f8e6-4b65-86ba-a69354555534 tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock "60b00251-25fc-483d-88fe-a84165d6a435" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 593.652s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.036661] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock "60b00251-25fc-483d-88fe-a84165d6a435" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 396.930s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1401.036965] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Acquiring lock "60b00251-25fc-483d-88fe-a84165d6a435-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1401.037207] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock "60b00251-25fc-483d-88fe-a84165d6a435-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1401.037381] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock "60b00251-25fc-483d-88fe-a84165d6a435-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.039353] env[69648]: INFO nova.compute.manager [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Terminating instance [ 1401.040769] env[69648]: DEBUG nova.compute.manager [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1401.040983] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1401.041597] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-14d70366-7304-4d52-8516-24fee53421da {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.051689] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95a16290-e44b-4887-8355-ebca2c6ac7f0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.062295] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 74a74c62-5c24-426e-ae6f-29511de99462] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1401.081971] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 60b00251-25fc-483d-88fe-a84165d6a435 could not be found. 
[ 1401.082199] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1401.082379] env[69648]: INFO nova.compute.manager [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1401.082608] env[69648]: DEBUG oslo.service.loopingcall [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1401.082898] env[69648]: DEBUG nova.compute.manager [-] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1401.082957] env[69648]: DEBUG nova.network.neutron [-] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1401.088993] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 74a74c62-5c24-426e-ae6f-29511de99462] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1401.107498] env[69648]: DEBUG nova.network.neutron [-] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1401.111173] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "74a74c62-5c24-426e-ae6f-29511de99462" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.228s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.115671] env[69648]: INFO nova.compute.manager [-] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] Took 0.03 seconds to deallocate network for instance. [ 1401.121024] env[69648]: DEBUG nova.compute.manager [None req-8972f900-08bd-4550-9208-40a1747a698a tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 60cb1a27-ecc3-43a6-8efa-b54fd2f400ba] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1401.145060] env[69648]: DEBUG nova.compute.manager [None req-8972f900-08bd-4550-9208-40a1747a698a tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 60cb1a27-ecc3-43a6-8efa-b54fd2f400ba] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1401.168395] env[69648]: DEBUG oslo_concurrency.lockutils [None req-8972f900-08bd-4550-9208-40a1747a698a tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "60cb1a27-ecc3-43a6-8efa-b54fd2f400ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 196.127s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.177486] env[69648]: DEBUG nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1401.242633] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4353a3a8-8f92-403d-80bc-c2fb8996c37f tempest-VolumesAdminNegativeTest-919289879 tempest-VolumesAdminNegativeTest-919289879-project-member] Lock "60b00251-25fc-483d-88fe-a84165d6a435" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.206s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.243357] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "60b00251-25fc-483d-88fe-a84165d6a435" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 227.226s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1401.243559] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 60b00251-25fc-483d-88fe-a84165d6a435] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1401.243741] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "60b00251-25fc-483d-88fe-a84165d6a435" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.244741] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1401.245011] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1401.246947] env[69648]: INFO nova.compute.claims [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1401.502320] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d701b224-b5d3-4c0c-8c76-d58895b16b88 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.509701] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4475363-8951-452d-a021-8943a1b3dce0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.539730] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ce20e5f-b6fa-45aa-894d-b76601b8766a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.546882] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-809b5698-db5b-4bb5-81c2-377507a2c7c7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.560069] env[69648]: DEBUG nova.compute.provider_tree [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1401.568250] env[69648]: DEBUG nova.scheduler.client.report [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 
'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1401.586950] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.342s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.587553] env[69648]: DEBUG nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1401.622014] env[69648]: DEBUG nova.compute.utils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1401.623540] env[69648]: DEBUG nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1401.623717] env[69648]: DEBUG nova.network.neutron [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1401.632824] env[69648]: DEBUG nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Start building block device mappings for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1401.675887] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "114bdafc-21f6-4a77-bf19-a444cbd8806c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1401.676144] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "114bdafc-21f6-4a77-bf19-a444cbd8806c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1401.681097] env[69648]: DEBUG nova.policy [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b3844bce7b7e4ab3bbb74091f772339a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b58979c5a8284a83a55a5e017b603a0d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1401.702963] env[69648]: DEBUG nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1401.725168] env[69648]: DEBUG nova.virt.hardware [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1401.725498] env[69648]: DEBUG nova.virt.hardware [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1401.725669] env[69648]: DEBUG nova.virt.hardware [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1401.725813] env[69648]: DEBUG nova.virt.hardware [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1401.725997] env[69648]: DEBUG nova.virt.hardware [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1401.726169] env[69648]: DEBUG nova.virt.hardware [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1401.726382] env[69648]: DEBUG nova.virt.hardware [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1401.726544] env[69648]: DEBUG nova.virt.hardware [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1401.726710] env[69648]: DEBUG nova.virt.hardware [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Got 1 
possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1401.726895] env[69648]: DEBUG nova.virt.hardware [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1401.727063] env[69648]: DEBUG nova.virt.hardware [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1401.728182] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58d6c9a2-3b96-4bf5-87dc-1b412a93a2fa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.736317] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bc26b3f-23ba-4a05-bc02-402241f01210 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.073641] env[69648]: DEBUG nova.network.neutron [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Successfully created port: 1b0a8a91-8b54-4564-9455-dd5c27c69260 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1402.697752] env[69648]: DEBUG nova.network.neutron [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Successfully updated port: 1b0a8a91-8b54-4564-9455-dd5c27c69260 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1402.711476] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquiring lock "refresh_cache-c97308be-406b-4fd0-b502-69e8c800773f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1402.711476] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquired lock "refresh_cache-c97308be-406b-4fd0-b502-69e8c800773f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1402.712476] env[69648]: DEBUG nova.network.neutron [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1402.748694] env[69648]: DEBUG nova.network.neutron [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1402.944568] env[69648]: DEBUG nova.network.neutron [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Updating instance_info_cache with network_info: [{"id": "1b0a8a91-8b54-4564-9455-dd5c27c69260", "address": "fa:16:3e:23:59:bc", "network": {"id": "84dcde41-809a-41f2-ac42-e8512478536b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1769663596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b58979c5a8284a83a55a5e017b603a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3734b156-0f7d-4721-b23c-d000412ec2eb", "external-id": "nsx-vlan-transportzone-560", "segmentation_id": 560, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b0a8a91-8b", "ovs_interfaceid": "1b0a8a91-8b54-4564-9455-dd5c27c69260", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1402.956460] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Releasing lock "refresh_cache-c97308be-406b-4fd0-b502-69e8c800773f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1402.956768] env[69648]: DEBUG nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Instance network_info: |[{"id": "1b0a8a91-8b54-4564-9455-dd5c27c69260", "address": "fa:16:3e:23:59:bc", "network": {"id": "84dcde41-809a-41f2-ac42-e8512478536b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1769663596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b58979c5a8284a83a55a5e017b603a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3734b156-0f7d-4721-b23c-d000412ec2eb", "external-id": "nsx-vlan-transportzone-560", "segmentation_id": 560, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b0a8a91-8b", "ovs_interfaceid": "1b0a8a91-8b54-4564-9455-dd5c27c69260", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1402.957317] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None 
req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:23:59:bc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3734b156-0f7d-4721-b23c-d000412ec2eb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1b0a8a91-8b54-4564-9455-dd5c27c69260', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1402.965339] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Creating folder: Project (b58979c5a8284a83a55a5e017b603a0d). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1402.965909] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3566815e-e311-4ae2-ad74-693009ef4b0b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.977020] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Created folder: Project (b58979c5a8284a83a55a5e017b603a0d) in parent group-v692308. [ 1402.977248] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Creating folder: Instances. Parent ref: group-v692384. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1402.977489] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-78ffc77f-0fe7-4b25-9bd5-3ebfb5608fb7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.986126] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Created folder: Instances in parent group-v692384. [ 1402.986362] env[69648]: DEBUG oslo.service.loopingcall [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1402.986541] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1402.986735] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3349069e-8a79-460c-877a-fafdbc84c398 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1403.006559] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1403.006559] env[69648]: value = "task-3466590" [ 1403.006559] env[69648]: _type = "Task" [ 1403.006559] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1403.014745] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466590, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1403.033073] env[69648]: DEBUG nova.compute.manager [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Received event network-vif-plugged-1b0a8a91-8b54-4564-9455-dd5c27c69260 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1403.033291] env[69648]: DEBUG oslo_concurrency.lockutils [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] Acquiring lock "c97308be-406b-4fd0-b502-69e8c800773f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1403.033502] env[69648]: DEBUG oslo_concurrency.lockutils [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] Lock "c97308be-406b-4fd0-b502-69e8c800773f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1403.033680] env[69648]: DEBUG oslo_concurrency.lockutils [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] Lock "c97308be-406b-4fd0-b502-69e8c800773f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1403.033846] env[69648]: DEBUG nova.compute.manager [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] [instance: c97308be-406b-4fd0-b502-69e8c800773f] No waiting events found dispatching network-vif-plugged-1b0a8a91-8b54-4564-9455-dd5c27c69260 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1403.034023] env[69648]: WARNING nova.compute.manager [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Received unexpected event network-vif-plugged-1b0a8a91-8b54-4564-9455-dd5c27c69260 for instance with vm_state building and task_state spawning. [ 1403.034192] env[69648]: DEBUG nova.compute.manager [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Received event network-changed-1b0a8a91-8b54-4564-9455-dd5c27c69260 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1403.034368] env[69648]: DEBUG nova.compute.manager [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Refreshing instance network info cache due to event network-changed-1b0a8a91-8b54-4564-9455-dd5c27c69260. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1403.034563] env[69648]: DEBUG oslo_concurrency.lockutils [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] Acquiring lock "refresh_cache-c97308be-406b-4fd0-b502-69e8c800773f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1403.034703] env[69648]: DEBUG oslo_concurrency.lockutils [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] Acquired lock "refresh_cache-c97308be-406b-4fd0-b502-69e8c800773f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1403.034912] env[69648]: DEBUG nova.network.neutron [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Refreshing network info cache for port 1b0a8a91-8b54-4564-9455-dd5c27c69260 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1403.326812] env[69648]: DEBUG nova.network.neutron [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Updated VIF entry in instance network info cache for port 1b0a8a91-8b54-4564-9455-dd5c27c69260. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1403.327182] env[69648]: DEBUG nova.network.neutron [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Updating instance_info_cache with network_info: [{"id": "1b0a8a91-8b54-4564-9455-dd5c27c69260", "address": "fa:16:3e:23:59:bc", "network": {"id": "84dcde41-809a-41f2-ac42-e8512478536b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1769663596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b58979c5a8284a83a55a5e017b603a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3734b156-0f7d-4721-b23c-d000412ec2eb", "external-id": "nsx-vlan-transportzone-560", "segmentation_id": 560, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b0a8a91-8b", "ovs_interfaceid": "1b0a8a91-8b54-4564-9455-dd5c27c69260", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1403.336424] env[69648]: DEBUG oslo_concurrency.lockutils [req-1567b9bc-bb11-4686-8b02-a319415caf97 req-9cc6d493-efca-445a-93a8-1490018ebf6e service nova] Releasing lock "refresh_cache-c97308be-406b-4fd0-b502-69e8c800773f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1403.516545] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466590, 'name': CreateVM_Task, 'duration_secs': 0.301574} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1403.516739] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1403.517425] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1403.517585] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1403.517917] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1403.518185] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-57286a28-a90e-4e97-839f-f24fd8b71a50 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1403.522541] env[69648]: DEBUG oslo_vmware.api [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Waiting for the task: (returnval){ [ 1403.522541] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5239cbc1-13b6-ddb0-ca0a-3868bc4b8dc5" [ 1403.522541] env[69648]: _type = "Task" [ 1403.522541] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1403.536727] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1403.536956] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1403.537182] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1405.047325] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquiring lock "c97308be-406b-4fd0-b502-69e8c800773f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1422.478120] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquiring lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1422.478459] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1423.609859] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1423.610177] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1431.065627] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1434.065775] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1435.065998] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1436.065595] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1437.066054] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1438.067548] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1438.067850] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1438.067978] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1438.068159] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1438.068352] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}} [ 1438.079515] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] There are 0 instances to clean {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}} [ 1439.847046] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9385912d-383d-40d0-ac55-1e66f7f4de1b tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquiring lock "97ce6d4b-ad90-47c7-885a-1f6632c8b97d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1439.847326] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9385912d-383d-40d0-ac55-1e66f7f4de1b tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "97ce6d4b-ad90-47c7-885a-1f6632c8b97d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1440.077175] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1441.065476] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1441.084907] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1441.085170] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1441.085426] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1441.085571] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for 
cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1441.086704] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ff51fa4-3cc6-49ae-84e7-f64845bb0f5c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1441.095641] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dc53ecd-5fb9-48a9-bf37-038e2822ce97 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1441.109994] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1486e92b-ce11-47ad-ae2e-c46c86702824 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1441.116740] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25b03c1f-4a23-4517-8cc8-eccc0d5ce3a6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1441.145843] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180972MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1441.146016] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1441.146249] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1441.269827] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6062dd02-230d-42bc-8304-fc122f1f1489 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1441.269993] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance e64fd474-91ab-449e-8785-e788685ed77a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1441.270168] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1441.270302] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d5fb115d-778d-4fc7-a03a-8f5828868a01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1441.270439] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ab839f84-b864-409e-883d-00dddb5db3db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1441.270600] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1441.270726] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a924bdee-1e16-4d78-ac6b-9574677de55f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1441.270847] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1441.270966] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 58804be5-ee46-4b25-be84-890d5cd1607f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1441.271109] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c97308be-406b-4fd0-b502-69e8c800773f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1441.286740] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance b7cd17f7-89f5-4f85-8964-532467432b59 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1441.302424] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1441.313651] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1441.327804] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 114bdafc-21f6-4a77-bf19-a444cbd8806c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1441.338610] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1441.355993] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1441.368367] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 97ce6d4b-ad90-47c7-885a-1f6632c8b97d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1441.368367] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1441.368367] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1441.387366] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing inventories for resource provider d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1441.404471] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Updating ProviderTree inventory for provider d38a352b-7808-44da-8216-792e96aadc88 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1441.404676] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Updating inventory in ProviderTree for provider d38a352b-7808-44da-8216-792e96aadc88 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1441.417263] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing aggregate associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, aggregates: None {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1441.443539] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing trait associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1441.638547] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-535b5780-25df-45e7-bfc2-53810c8826be {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1441.646300] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-407ee6a4-2481-4b4c-a6a6-b6ad641a59d1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1441.676968] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bae2cb43-34a8-4d4b-b5f1-6d2947cd5112 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1441.684234] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b598133-cf90-4759-b074-dd99a15a09b1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1441.697224] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1441.706322] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1441.720270] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1441.720668] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1441.720668] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1441.721026] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances with incomplete migration {{(pid=69648) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}} [ 1442.729730] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1442.730023] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1442.730098] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the 
list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1442.752402] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1442.752551] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1442.752682] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1442.752811] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1442.752938] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1442.753078] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1442.753238] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1442.753370] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1442.753492] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1442.753612] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1442.753733] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1449.610064] env[69648]: WARNING oslo_vmware.rw_handles [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1449.610064] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1449.610064] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1449.610064] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1449.610064] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1449.610064] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1449.610064] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1449.610064] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1449.610064] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1449.610064] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1449.610064] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1449.610064] env[69648]: ERROR oslo_vmware.rw_handles [ 1449.610661] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/94f025bd-222c-424d-837e-f85ee9dd225a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1449.612505] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1449.612750] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Copying Virtual Disk [datastore1] vmware_temp/94f025bd-222c-424d-837e-f85ee9dd225a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/94f025bd-222c-424d-837e-f85ee9dd225a/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1449.613036] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dd800cb4-6f3d-44c7-ad4f-fa4f7aa36696 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.620726] env[69648]: DEBUG oslo_vmware.api [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Waiting for the task: (returnval){ [ 1449.620726] env[69648]: 
value = "task-3466591" [ 1449.620726] env[69648]: _type = "Task" [ 1449.620726] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1449.629785] env[69648]: DEBUG oslo_vmware.api [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Task: {'id': task-3466591, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1450.130426] env[69648]: DEBUG oslo_vmware.exceptions [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1450.130671] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1450.131242] env[69648]: ERROR nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1450.131242] env[69648]: Faults: ['InvalidArgument'] [ 1450.131242] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Traceback (most recent call last): [ 1450.131242] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1450.131242] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] yield resources [ 1450.131242] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1450.131242] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self.driver.spawn(context, instance, image_meta, [ 1450.131242] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1450.131242] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1450.131242] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1450.131242] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self._fetch_image_if_missing(context, vi) [ 1450.131242] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] image_cache(vi, tmp_image_ds_loc) [ 1450.131636] 
env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] vm_util.copy_virtual_disk( [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] session._wait_for_task(vmdk_copy_task) [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] return self.wait_for_task(task_ref) [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] return evt.wait() [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] result = hub.switch() [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1450.131636] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] return self.greenlet.switch() [ 1450.131991] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1450.131991] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self.f(*self.args, **self.kw) [ 1450.131991] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1450.131991] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] raise exceptions.translate_fault(task_info.error) [ 1450.131991] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1450.131991] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Faults: ['InvalidArgument'] [ 1450.131991] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] [ 1450.131991] env[69648]: INFO nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Terminating instance [ 1450.133273] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquired 
lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1450.133396] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1450.133879] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquiring lock "refresh_cache-6062dd02-230d-42bc-8304-fc122f1f1489" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1450.134044] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquired lock "refresh_cache-6062dd02-230d-42bc-8304-fc122f1f1489" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1450.134213] env[69648]: DEBUG nova.network.neutron [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1450.135115] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d2383c44-eda9-45c8-a646-2c2c30b8e940 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1450.804225] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1450.804471] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1450.805414] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c0383dfa-42ef-4e9c-a4f1-982334170ee0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1450.810313] env[69648]: DEBUG oslo_vmware.api [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Waiting for the task: (returnval){ [ 1450.810313] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52084f2c-c2ef-b464-b616-adabc301a29a" [ 1450.810313] env[69648]: _type = "Task" [ 1450.810313] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1450.817859] env[69648]: DEBUG oslo_vmware.api [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52084f2c-c2ef-b464-b616-adabc301a29a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1450.818582] env[69648]: DEBUG nova.network.neutron [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1450.904467] env[69648]: DEBUG nova.network.neutron [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1450.913539] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Releasing lock "refresh_cache-6062dd02-230d-42bc-8304-fc122f1f1489" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1450.914111] env[69648]: DEBUG nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1450.914315] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1450.915450] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61862f09-fc2a-4ca7-a732-a6e737cd2d8b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1450.923306] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1450.923506] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2b33822f-a997-4d74-84b7-efa57695b6bb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1450.955759] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1450.955983] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1450.956179] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Deleting the datastore file [datastore1] 6062dd02-230d-42bc-8304-fc122f1f1489 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1450.956462] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d22400fb-8c06-49da-850a-9684189828b4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1450.962144] env[69648]: DEBUG oslo_vmware.api [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Waiting for the task: (returnval){ [ 1450.962144] env[69648]: value = "task-3466593" [ 1450.962144] env[69648]: _type = "Task" [ 1450.962144] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1450.969703] env[69648]: DEBUG oslo_vmware.api [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Task: {'id': task-3466593, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1451.321166] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1451.321417] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Creating directory with path [datastore1] vmware_temp/d5c3e5d0-7368-4ceb-95c1-e0b7f764ddbe/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1451.321648] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1e848188-0289-4aed-b41d-1fedd4dcd604 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.334746] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Created directory with path [datastore1] vmware_temp/d5c3e5d0-7368-4ceb-95c1-e0b7f764ddbe/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1451.334950] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Fetch image to [datastore1] vmware_temp/d5c3e5d0-7368-4ceb-95c1-e0b7f764ddbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1451.335136] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/d5c3e5d0-7368-4ceb-95c1-e0b7f764ddbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1451.335893] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39b0696f-41b8-4806-8eae-cb7a9c396a03 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.342425] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06ad9774-a205-4a3f-95d2-1a2489ba2df3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.351140] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf541ced-259d-4d9f-b6df-d00ccf7e7035 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.382439] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec66551d-6a19-409e-9825-f6c4f9991814 
{{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.387972] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b1bfa676-4400-46bf-8151-c87a4bc8f0de {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.407944] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1451.456645] env[69648]: DEBUG oslo_vmware.rw_handles [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d5c3e5d0-7368-4ceb-95c1-e0b7f764ddbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1451.517061] env[69648]: DEBUG oslo_vmware.rw_handles [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1451.517264] env[69648]: DEBUG oslo_vmware.rw_handles [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d5c3e5d0-7368-4ceb-95c1-e0b7f764ddbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1451.520872] env[69648]: DEBUG oslo_vmware.api [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Task: {'id': task-3466593, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.034657} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1451.521155] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1451.521369] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1451.521565] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1451.521740] env[69648]: INFO nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1451.521987] env[69648]: DEBUG oslo.service.loopingcall [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1451.522243] env[69648]: DEBUG nova.compute.manager [-] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Skipping network deallocation for instance since networking was not requested. 
{{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 1451.524657] env[69648]: DEBUG nova.compute.claims [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1451.524855] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1451.525115] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1451.744250] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50d3d3ce-b8fc-4b20-93f1-ec5860bc19ab {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.751521] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58c3c315-6230-47aa-a737-6f60becb653c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.780033] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c4dcf0f-793e-43c2-96e2-dcae680d4df6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.786616] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b5daa75-cd21-445f-9b38-93c25123d1d4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.799254] env[69648]: DEBUG nova.compute.provider_tree [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1451.807285] env[69648]: DEBUG nova.scheduler.client.report [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1451.821670] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a 
tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.296s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1451.822215] env[69648]: ERROR nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1451.822215] env[69648]: Faults: ['InvalidArgument'] [ 1451.822215] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Traceback (most recent call last): [ 1451.822215] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1451.822215] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self.driver.spawn(context, instance, image_meta, [ 1451.822215] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1451.822215] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1451.822215] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1451.822215] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self._fetch_image_if_missing(context, vi) [ 1451.822215] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1451.822215] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] image_cache(vi, tmp_image_ds_loc) [ 1451.822215] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] vm_util.copy_virtual_disk( [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] session._wait_for_task(vmdk_copy_task) [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] return self.wait_for_task(task_ref) [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] return evt.wait() [ 1451.822527] env[69648]: ERROR 
nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] result = hub.switch() [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] return self.greenlet.switch() [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1451.822527] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self.f(*self.args, **self.kw) [ 1451.822809] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1451.822809] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] raise exceptions.translate_fault(task_info.error) [ 1451.822809] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1451.822809] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Faults: ['InvalidArgument'] [ 1451.822809] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] [ 1451.822943] env[69648]: DEBUG nova.compute.utils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1451.825503] env[69648]: DEBUG nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Build of instance 6062dd02-230d-42bc-8304-fc122f1f1489 was re-scheduled: A specified parameter was not correct: fileType [ 1451.825503] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1451.825885] env[69648]: DEBUG nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1451.826125] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquiring lock "refresh_cache-6062dd02-230d-42bc-8304-fc122f1f1489" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1451.826280] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquired lock 
"refresh_cache-6062dd02-230d-42bc-8304-fc122f1f1489" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1451.826482] env[69648]: DEBUG nova.network.neutron [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1451.850759] env[69648]: DEBUG nova.network.neutron [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1451.911027] env[69648]: DEBUG nova.network.neutron [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1451.920146] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Releasing lock "refresh_cache-6062dd02-230d-42bc-8304-fc122f1f1489" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1451.920469] env[69648]: DEBUG nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1451.920687] env[69648]: DEBUG nova.compute.manager [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Skipping network deallocation for instance since networking was not requested. 
{{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 1452.026850] env[69648]: INFO nova.scheduler.client.report [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Deleted allocations for instance 6062dd02-230d-42bc-8304-fc122f1f1489 [ 1452.044344] env[69648]: DEBUG oslo_concurrency.lockutils [None req-333cc04b-980d-4084-9d7d-d98c653b353a tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "6062dd02-230d-42bc-8304-fc122f1f1489" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 636.359s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.045473] env[69648]: DEBUG oslo_concurrency.lockutils [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "6062dd02-230d-42bc-8304-fc122f1f1489" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 435.226s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1452.045701] env[69648]: DEBUG oslo_concurrency.lockutils [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquiring lock "6062dd02-230d-42bc-8304-fc122f1f1489-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1452.045903] env[69648]: DEBUG oslo_concurrency.lockutils [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "6062dd02-230d-42bc-8304-fc122f1f1489-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1452.046080] env[69648]: DEBUG oslo_concurrency.lockutils [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "6062dd02-230d-42bc-8304-fc122f1f1489-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.048209] env[69648]: INFO nova.compute.manager [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Terminating instance [ 1452.050038] env[69648]: DEBUG oslo_concurrency.lockutils [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquiring lock "refresh_cache-6062dd02-230d-42bc-8304-fc122f1f1489" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1452.050249] env[69648]: DEBUG oslo_concurrency.lockutils [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Acquired lock "refresh_cache-6062dd02-230d-42bc-8304-fc122f1f1489" 
{{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1452.050463] env[69648]: DEBUG nova.network.neutron [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1452.059895] env[69648]: DEBUG nova.compute.manager [None req-4c083434-c107-4bfd-bcc5-68e9c3b0983f tempest-ServerShowV254Test-458023032 tempest-ServerShowV254Test-458023032-project-member] [instance: 8426cc36-f026-46d9-844e-432343410efe] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1452.077660] env[69648]: DEBUG nova.network.neutron [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1452.088497] env[69648]: DEBUG nova.compute.manager [None req-4c083434-c107-4bfd-bcc5-68e9c3b0983f tempest-ServerShowV254Test-458023032 tempest-ServerShowV254Test-458023032-project-member] [instance: 8426cc36-f026-46d9-844e-432343410efe] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1452.109957] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4c083434-c107-4bfd-bcc5-68e9c3b0983f tempest-ServerShowV254Test-458023032 tempest-ServerShowV254Test-458023032-project-member] Lock "8426cc36-f026-46d9-844e-432343410efe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.962s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.122314] env[69648]: DEBUG nova.compute.manager [None req-7e6c10fe-2829-47ed-91bc-db2424e0e8cd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 27bd5158-0b5f-408d-b91c-e7f3fbde894e] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1452.146858] env[69648]: DEBUG nova.compute.manager [None req-7e6c10fe-2829-47ed-91bc-db2424e0e8cd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 27bd5158-0b5f-408d-b91c-e7f3fbde894e] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1452.156836] env[69648]: DEBUG nova.network.neutron [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1452.164825] env[69648]: DEBUG oslo_concurrency.lockutils [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Releasing lock "refresh_cache-6062dd02-230d-42bc-8304-fc122f1f1489" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1452.165265] env[69648]: DEBUG nova.compute.manager [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1452.165504] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1452.166050] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d3548c7b-0745-4a8c-8ef7-d0adefa86c71 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.171576] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7e6c10fe-2829-47ed-91bc-db2424e0e8cd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "27bd5158-0b5f-408d-b91c-e7f3fbde894e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.177054] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b15f23fb-a2cf-4f1f-95e9-0a702f464d12 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.188414] env[69648]: DEBUG nova.compute.manager [None req-1e59b601-1e4b-4be2-a779-3a16609aa839 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: 0af9540b-b092-4396-8573-cdadc66abe02] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1452.208373] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6062dd02-230d-42bc-8304-fc122f1f1489 could not be found. 
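The records around this point serialize all work on the instance through named oslo.concurrency locks: "refresh_cache-<uuid>" around the network-info refresh, "<uuid>-events" while clearing pending events, and "compute_resources" during the resource claim, with the waited/held durations logged from lockutils.py:404/409/423. The following is a minimal illustrative sketch of that locking pattern, not code taken from nova; the function names are placeholders.

    # Illustrative sketch of the named-lock pattern logged above
    # (oslo_concurrency.lockutils); function names are placeholders.
    from oslo_concurrency import lockutils

    def refresh_network_cache(instance_uuid):
        # In-process named lock; the "Acquiring lock" / "Acquired lock" /
        # "Releasing lock" records above bracket a critical section like this.
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # rebuild the instance_info_cache here

    # Decorator form: serialize every caller on one shared name, as with the
    # "compute_resources" lock held during ResourceTracker.instance_claim.
    @lockutils.synchronized('compute_resources')
    def claim_resources(instance_uuid):
        pass  # placeholder for the claim/accounting logic

The "waited"/"held" timings in the surrounding records are emitted by lockutils itself when a lock of this kind is acquired and released.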
[ 1452.208574] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1452.208749] env[69648]: INFO nova.compute.manager [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1452.208987] env[69648]: DEBUG oslo.service.loopingcall [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1452.209215] env[69648]: DEBUG nova.compute.manager [-] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1452.209311] env[69648]: DEBUG nova.network.neutron [-] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1452.211920] env[69648]: DEBUG nova.compute.manager [None req-1e59b601-1e4b-4be2-a779-3a16609aa839 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: 0af9540b-b092-4396-8573-cdadc66abe02] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1452.230858] env[69648]: DEBUG oslo_concurrency.lockutils [None req-1e59b601-1e4b-4be2-a779-3a16609aa839 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "0af9540b-b092-4396-8573-cdadc66abe02" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.032s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.238984] env[69648]: DEBUG nova.compute.manager [None req-21e577e4-a8c6-41b2-9138-20ab717439fe tempest-ServerRescueTestJSON-1037422394 tempest-ServerRescueTestJSON-1037422394-project-member] [instance: 0e63df98-0345-4d75-b128-291048849e40] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1452.264413] env[69648]: DEBUG nova.compute.manager [None req-21e577e4-a8c6-41b2-9138-20ab717439fe tempest-ServerRescueTestJSON-1037422394 tempest-ServerRescueTestJSON-1037422394-project-member] [instance: 0e63df98-0345-4d75-b128-291048849e40] Instance disappeared before build. 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1452.285347] env[69648]: DEBUG oslo_concurrency.lockutils [None req-21e577e4-a8c6-41b2-9138-20ab717439fe tempest-ServerRescueTestJSON-1037422394 tempest-ServerRescueTestJSON-1037422394-project-member] Lock "0e63df98-0345-4d75-b128-291048849e40" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.498s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.294820] env[69648]: DEBUG nova.compute.manager [None req-afa03ea8-6f97-41f2-bc03-199d24794c07 tempest-ServersV294TestFqdnHostnames-720767939 tempest-ServersV294TestFqdnHostnames-720767939-project-member] [instance: b7cd17f7-89f5-4f85-8964-532467432b59] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1452.317806] env[69648]: DEBUG nova.compute.manager [None req-afa03ea8-6f97-41f2-bc03-199d24794c07 tempest-ServersV294TestFqdnHostnames-720767939 tempest-ServersV294TestFqdnHostnames-720767939-project-member] [instance: b7cd17f7-89f5-4f85-8964-532467432b59] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1452.340623] env[69648]: DEBUG oslo_concurrency.lockutils [None req-afa03ea8-6f97-41f2-bc03-199d24794c07 tempest-ServersV294TestFqdnHostnames-720767939 tempest-ServersV294TestFqdnHostnames-720767939-project-member] Lock "b7cd17f7-89f5-4f85-8964-532467432b59" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.170s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.351477] env[69648]: DEBUG nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1452.358043] env[69648]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=69648) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1452.358366] env[69648]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1452.359153] env[69648]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-f05b3464-456b-46e8-8b4b-7bb7c73e1883'] [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1452.359153] env[69648]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1452.359530] env[69648]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1452.359530] env[69648]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1452.359923] env[69648]: ERROR oslo.service.loopingcall [ 1452.360304] env[69648]: ERROR nova.compute.manager [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1452.394094] env[69648]: ERROR nova.compute.manager [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1452.394094] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Traceback (most recent call last): [ 1452.394094] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.394094] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] ret = obj(*args, **kwargs) [ 1452.394094] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1452.394094] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] exception_handler_v20(status_code, error_body) [ 1452.394094] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1452.394094] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] raise client_exc(message=error_message, [ 1452.394094] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1452.394094] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Neutron server returns request_ids: ['req-f05b3464-456b-46e8-8b4b-7bb7c73e1883'] [ 1452.394094] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] During handling of the above exception, another exception occurred: [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Traceback (most recent call last): [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self._delete_instance(context, instance, bdms) [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self._shutdown_instance(context, instance, bdms) [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self._try_deallocate_network(context, instance, requested_networks) [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] with excutils.save_and_reraise_exception(): [ 1452.394668] env[69648]: ERROR 
nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1452.394668] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self.force_reraise() [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] raise self.value [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] _deallocate_network_with_retries() [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] return evt.wait() [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] result = hub.switch() [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] return self.greenlet.switch() [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1452.395088] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] result = func(*self.args, **self.kw) [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] result = f(*args, **kwargs) [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self._deallocate_network( [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self.network_api.deallocate_for_instance( [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 
6062dd02-230d-42bc-8304-fc122f1f1489] data = neutron.list_ports(**search_opts) [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] ret = obj(*args, **kwargs) [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] return self.list('ports', self.ports_path, retrieve_all, [ 1452.395650] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] ret = obj(*args, **kwargs) [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] for r in self._pagination(collection, path, **params): [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] res = self.get(path, params=params) [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] ret = obj(*args, **kwargs) [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] return self.retry_request("GET", action, body=body, [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] ret = obj(*args, **kwargs) [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1452.396029] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] return self.do_request(method, action, body=body, [ 1452.396481] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.396481] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] ret = obj(*args, **kwargs) [ 1452.396481] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1452.396481] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] self._handle_fault_response(status_code, replybody, resp) [ 1452.396481] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1452.396481] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1452.396481] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1452.396481] env[69648]: ERROR nova.compute.manager [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] [ 1452.414794] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1452.415109] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1452.416942] env[69648]: INFO nova.compute.claims [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1452.423344] env[69648]: DEBUG oslo_concurrency.lockutils [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Lock "6062dd02-230d-42bc-8304-fc122f1f1489" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.378s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.424834] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "6062dd02-230d-42bc-8304-fc122f1f1489" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 278.407s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1452.424834] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1452.424834] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "6062dd02-230d-42bc-8304-fc122f1f1489" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.485633] env[69648]: INFO nova.compute.manager [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] [instance: 6062dd02-230d-42bc-8304-fc122f1f1489] Successfully reverted task state from None on failure for instance. [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server [None req-52424b7a-6b20-4847-899c-31521985f4d3 tempest-ServerShowV247Test-876702292 tempest-ServerShowV247Test-876702292-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-f05b3464-456b-46e8-8b4b-7bb7c73e1883'] [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1452.489233] env[69648]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1452.489973] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server return f(*args, 
**kwargs) [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance [ 1452.490502] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1452.490963] env[69648]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.491452] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1452.491879] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1452.492309] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1452.492309] 
env[69648]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1452.492309] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1452.492309] env[69648]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1452.492309] env[69648]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1452.492309] env[69648]: ERROR oslo_messaging.rpc.server [ 1452.657031] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c10c3d27-e4ee-4add-937d-2291a6ebeec3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.664882] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb289dd6-fa6e-4d0e-85f5-a5098152719a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.693805] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-347b6428-da4b-4473-ae6c-c3de6ad22d86 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.700496] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac6b87b3-604f-4578-b598-f8e32d7ec836 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.714067] env[69648]: DEBUG nova.compute.provider_tree [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1452.722823] env[69648]: DEBUG nova.scheduler.client.report [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1452.736094] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.321s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.736568] env[69648]: DEBUG nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Start building networks 
asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1452.768366] env[69648]: DEBUG nova.compute.utils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1452.769466] env[69648]: DEBUG nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1452.769672] env[69648]: DEBUG nova.network.neutron [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1452.779108] env[69648]: DEBUG nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1452.822934] env[69648]: DEBUG nova.policy [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cc2dddbbaacd46a2b666a91962b7b61b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5dd14788cf484723b237d19251d169b9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1452.839456] env[69648]: DEBUG nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1452.863815] env[69648]: DEBUG nova.virt.hardware [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1452.864062] env[69648]: DEBUG nova.virt.hardware [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1452.864225] env[69648]: DEBUG nova.virt.hardware [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1452.864410] env[69648]: DEBUG nova.virt.hardware [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1452.864561] env[69648]: DEBUG nova.virt.hardware [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1452.864711] env[69648]: DEBUG nova.virt.hardware [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1452.864913] env[69648]: DEBUG nova.virt.hardware [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1452.865082] env[69648]: DEBUG nova.virt.hardware [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1452.865256] env[69648]: DEBUG nova.virt.hardware [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1452.865491] env[69648]: DEBUG nova.virt.hardware [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1452.865666] env[69648]: DEBUG nova.virt.hardware [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1452.866545] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34821b28-f966-44f9-bedf-2f625fe628e1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.874512] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f760c1ef-0bf0-417e-9ea1-16e6030317ce {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1453.240459] env[69648]: DEBUG nova.network.neutron [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Successfully created port: d76af88a-b3db-4740-8183-ee4f56f83ff2 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1453.756708] env[69648]: DEBUG nova.compute.manager [req-70aa6339-efaa-46f6-8e43-f1431c491182 req-52a35efe-aa4e-44c8-883f-04dbeffc2eed service nova] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Received event network-vif-plugged-d76af88a-b3db-4740-8183-ee4f56f83ff2 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1453.756947] env[69648]: DEBUG oslo_concurrency.lockutils [req-70aa6339-efaa-46f6-8e43-f1431c491182 req-52a35efe-aa4e-44c8-883f-04dbeffc2eed service nova] Acquiring lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1453.757183] env[69648]: DEBUG oslo_concurrency.lockutils [req-70aa6339-efaa-46f6-8e43-f1431c491182 req-52a35efe-aa4e-44c8-883f-04dbeffc2eed service nova] Lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1453.757357] env[69648]: DEBUG oslo_concurrency.lockutils [req-70aa6339-efaa-46f6-8e43-f1431c491182 req-52a35efe-aa4e-44c8-883f-04dbeffc2eed service nova] Lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1453.757531] env[69648]: DEBUG nova.compute.manager [req-70aa6339-efaa-46f6-8e43-f1431c491182 req-52a35efe-aa4e-44c8-883f-04dbeffc2eed service nova] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] No waiting events found dispatching network-vif-plugged-d76af88a-b3db-4740-8183-ee4f56f83ff2 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1453.757697] env[69648]: WARNING nova.compute.manager [req-70aa6339-efaa-46f6-8e43-f1431c491182 req-52a35efe-aa4e-44c8-883f-04dbeffc2eed service nova] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Received unexpected event network-vif-plugged-d76af88a-b3db-4740-8183-ee4f56f83ff2 for instance with vm_state building and task_state spawning. [ 1453.830593] env[69648]: DEBUG nova.network.neutron [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Successfully updated port: d76af88a-b3db-4740-8183-ee4f56f83ff2 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1453.842265] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "refresh_cache-590dbeb2-7e21-454f-93b5-97065c5bfdb0" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1453.842417] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquired lock "refresh_cache-590dbeb2-7e21-454f-93b5-97065c5bfdb0" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1453.842611] env[69648]: DEBUG nova.network.neutron [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1453.877494] env[69648]: DEBUG nova.network.neutron [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1454.042376] env[69648]: DEBUG nova.network.neutron [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Updating instance_info_cache with network_info: [{"id": "d76af88a-b3db-4740-8183-ee4f56f83ff2", "address": "fa:16:3e:8e:4f:c7", "network": {"id": "9646bb87-1573-4497-a44e-2918d0bc9a18", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1280380194-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5dd14788cf484723b237d19251d169b9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bed837fa-6b6a-4192-a229-a99426a46065", "external-id": "nsx-vlan-transportzone-954", "segmentation_id": 954, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd76af88a-b3", "ovs_interfaceid": "d76af88a-b3db-4740-8183-ee4f56f83ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1454.054018] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Releasing lock "refresh_cache-590dbeb2-7e21-454f-93b5-97065c5bfdb0" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1454.054018] env[69648]: DEBUG nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Instance network_info: |[{"id": "d76af88a-b3db-4740-8183-ee4f56f83ff2", "address": "fa:16:3e:8e:4f:c7", "network": {"id": "9646bb87-1573-4497-a44e-2918d0bc9a18", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1280380194-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5dd14788cf484723b237d19251d169b9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bed837fa-6b6a-4192-a229-a99426a46065", "external-id": "nsx-vlan-transportzone-954", "segmentation_id": 954, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd76af88a-b3", "ovs_interfaceid": "d76af88a-b3db-4740-8183-ee4f56f83ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1454.054322] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8e:4f:c7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'bed837fa-6b6a-4192-a229-a99426a46065', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd76af88a-b3db-4740-8183-ee4f56f83ff2', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1454.061769] env[69648]: DEBUG oslo.service.loopingcall [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1454.062200] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1454.062417] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-928819cf-9722-46eb-a2ef-15c3c697d461 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1454.081691] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1454.081691] env[69648]: value = "task-3466594" [ 1454.081691] env[69648]: _type = "Task" [ 1454.081691] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1454.089088] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466594, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1454.592055] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466594, 'name': CreateVM_Task, 'duration_secs': 0.300526} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1454.592253] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1454.592951] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1454.593113] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1454.593503] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1454.593758] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-53d2b5d4-dce7-46ca-ba5d-511df1c0f5c0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1454.598919] env[69648]: DEBUG oslo_vmware.api [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for the task: (returnval){ [ 1454.598919] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52028e42-fbd6-9eea-aeb7-a824f8bccf25" [ 1454.598919] env[69648]: _type = "Task" [ 1454.598919] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1454.608150] env[69648]: DEBUG oslo_vmware.api [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52028e42-fbd6-9eea-aeb7-a824f8bccf25, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1455.109864] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1455.110221] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1455.110338] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1455.784924] env[69648]: DEBUG nova.compute.manager [req-5bdb67b3-3d4a-498b-9768-2b29cf9555bf req-08526f40-7f2e-4bfa-a710-6a9dfc98f04f service nova] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Received event network-changed-d76af88a-b3db-4740-8183-ee4f56f83ff2 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1455.785162] env[69648]: DEBUG nova.compute.manager [req-5bdb67b3-3d4a-498b-9768-2b29cf9555bf req-08526f40-7f2e-4bfa-a710-6a9dfc98f04f service nova] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Refreshing instance network info cache due to event network-changed-d76af88a-b3db-4740-8183-ee4f56f83ff2. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1455.785424] env[69648]: DEBUG oslo_concurrency.lockutils [req-5bdb67b3-3d4a-498b-9768-2b29cf9555bf req-08526f40-7f2e-4bfa-a710-6a9dfc98f04f service nova] Acquiring lock "refresh_cache-590dbeb2-7e21-454f-93b5-97065c5bfdb0" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1455.785586] env[69648]: DEBUG oslo_concurrency.lockutils [req-5bdb67b3-3d4a-498b-9768-2b29cf9555bf req-08526f40-7f2e-4bfa-a710-6a9dfc98f04f service nova] Acquired lock "refresh_cache-590dbeb2-7e21-454f-93b5-97065c5bfdb0" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1455.785759] env[69648]: DEBUG nova.network.neutron [req-5bdb67b3-3d4a-498b-9768-2b29cf9555bf req-08526f40-7f2e-4bfa-a710-6a9dfc98f04f service nova] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Refreshing network info cache for port d76af88a-b3db-4740-8183-ee4f56f83ff2 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1456.065301] env[69648]: DEBUG nova.network.neutron [req-5bdb67b3-3d4a-498b-9768-2b29cf9555bf req-08526f40-7f2e-4bfa-a710-6a9dfc98f04f service nova] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Updated VIF entry in instance network info cache for port d76af88a-b3db-4740-8183-ee4f56f83ff2. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1456.065740] env[69648]: DEBUG nova.network.neutron [req-5bdb67b3-3d4a-498b-9768-2b29cf9555bf req-08526f40-7f2e-4bfa-a710-6a9dfc98f04f service nova] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Updating instance_info_cache with network_info: [{"id": "d76af88a-b3db-4740-8183-ee4f56f83ff2", "address": "fa:16:3e:8e:4f:c7", "network": {"id": "9646bb87-1573-4497-a44e-2918d0bc9a18", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1280380194-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5dd14788cf484723b237d19251d169b9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bed837fa-6b6a-4192-a229-a99426a46065", "external-id": "nsx-vlan-transportzone-954", "segmentation_id": 954, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd76af88a-b3", "ovs_interfaceid": "d76af88a-b3db-4740-8183-ee4f56f83ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1456.075633] env[69648]: DEBUG oslo_concurrency.lockutils [req-5bdb67b3-3d4a-498b-9768-2b29cf9555bf req-08526f40-7f2e-4bfa-a710-6a9dfc98f04f service nova] Releasing lock "refresh_cache-590dbeb2-7e21-454f-93b5-97065c5bfdb0" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1491.085912] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1493.064448] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1494.065710] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1495.065568] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1496.065423] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1499.012030] env[69648]: WARNING oslo_vmware.rw_handles [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 
tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1499.012030] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1499.012030] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1499.012030] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1499.012030] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1499.012030] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1499.012030] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1499.012030] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1499.012030] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1499.012030] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1499.012030] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1499.012030] env[69648]: ERROR oslo_vmware.rw_handles [ 1499.012543] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/d5c3e5d0-7368-4ceb-95c1-e0b7f764ddbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1499.015028] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1499.015117] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Copying Virtual Disk [datastore1] vmware_temp/d5c3e5d0-7368-4ceb-95c1-e0b7f764ddbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/d5c3e5d0-7368-4ceb-95c1-e0b7f764ddbe/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1499.015415] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f959e46d-286f-47d1-aa25-0c570fbc6639 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.023616] env[69648]: DEBUG oslo_vmware.api [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Waiting for the task: (returnval){ [ 1499.023616] env[69648]: value = "task-3466595" [ 1499.023616] env[69648]: _type = "Task" [ 1499.023616] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1499.031973] env[69648]: DEBUG oslo_vmware.api [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Task: {'id': task-3466595, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1499.059560] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1499.533860] env[69648]: DEBUG oslo_vmware.exceptions [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1499.534177] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1499.534725] env[69648]: ERROR nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1499.534725] env[69648]: Faults: ['InvalidArgument'] [ 1499.534725] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] Traceback (most recent call last): [ 1499.534725] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1499.534725] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] yield resources [ 1499.534725] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1499.534725] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self.driver.spawn(context, instance, image_meta, [ 1499.534725] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1499.534725] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1499.534725] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1499.534725] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self._fetch_image_if_missing(context, vi) [ 1499.534725] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] image_cache(vi, tmp_image_ds_loc) [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] vm_util.copy_virtual_disk( [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] session._wait_for_task(vmdk_copy_task) [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] return self.wait_for_task(task_ref) [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] return evt.wait() [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] result = hub.switch() [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1499.535120] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] return self.greenlet.switch() [ 1499.535504] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1499.535504] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self.f(*self.args, **self.kw) [ 1499.535504] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1499.535504] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] raise exceptions.translate_fault(task_info.error) [ 1499.535504] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1499.535504] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] Faults: ['InvalidArgument'] [ 1499.535504] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] [ 1499.535504] env[69648]: INFO nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: 
e64fd474-91ab-449e-8785-e788685ed77a] Terminating instance [ 1499.536646] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1499.536881] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1499.537138] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-61497d15-9973-4412-94fb-e777bc32d65d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.539170] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquiring lock "refresh_cache-e64fd474-91ab-449e-8785-e788685ed77a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1499.539332] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquired lock "refresh_cache-e64fd474-91ab-449e-8785-e788685ed77a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1499.539499] env[69648]: DEBUG nova.network.neutron [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1499.546319] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1499.546495] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1499.547672] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a4e79a2c-2a1b-417e-8e81-38a91913dad4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.554754] env[69648]: DEBUG oslo_vmware.api [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for the task: (returnval){ [ 1499.554754] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c7272d-9595-c75c-3f4d-4b09295b0463" [ 1499.554754] env[69648]: _type = "Task" [ 1499.554754] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1499.563118] env[69648]: DEBUG oslo_vmware.api [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c7272d-9595-c75c-3f4d-4b09295b0463, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1499.567759] env[69648]: DEBUG nova.network.neutron [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1499.627124] env[69648]: DEBUG nova.network.neutron [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1499.636025] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Releasing lock "refresh_cache-e64fd474-91ab-449e-8785-e788685ed77a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1499.636514] env[69648]: DEBUG nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1499.636772] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1499.637855] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba31d052-384b-4293-9091-2a8b6f8bb626 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.645591] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1499.645824] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e835ba1c-1934-42b5-87cd-7ff583f5d972 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.676688] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1499.676923] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1499.677140] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Deleting the datastore file [datastore1] e64fd474-91ab-449e-8785-e788685ed77a {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1499.677389] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e06498ef-f8c2-495b-9460-ee9c810288e8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.682958] env[69648]: DEBUG oslo_vmware.api [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Waiting for the task: (returnval){ [ 1499.682958] env[69648]: value = "task-3466597" [ 1499.682958] env[69648]: _type = "Task" [ 1499.682958] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1499.690278] env[69648]: DEBUG oslo_vmware.api [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Task: {'id': task-3466597, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1500.065809] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1500.066076] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1500.066437] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1500.066664] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Creating directory with path [datastore1] vmware_temp/ffb3febf-ac7f-490a-beff-5391fe7ef189/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1500.066916] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d31e79d1-7ce4-4cc8-9ef6-91ab20e5d57a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.079899] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Created directory with path [datastore1] vmware_temp/ffb3febf-ac7f-490a-beff-5391fe7ef189/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1500.079899] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Fetch image to [datastore1] vmware_temp/ffb3febf-ac7f-490a-beff-5391fe7ef189/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1500.080106] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/ffb3febf-ac7f-490a-beff-5391fe7ef189/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1500.081076] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c5a444d-0e35-4889-a8bd-19ba24f8afdb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.087583] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9b1a8a62-72ac-455a-91e0-7a753de12dac {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.097817] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9081ac22-a2f5-48b9-b910-aedd6a3005ef {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.128264] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5d249a2-74ac-473e-bf2d-0235de8645ed {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.134159] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5769d6e3-3a75-450f-b007-3bb767ee445d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.159179] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1500.195999] env[69648]: DEBUG oslo_vmware.api [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Task: {'id': task-3466597, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.04279} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1500.197795] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1500.197983] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1500.198188] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1500.198365] env[69648]: INFO nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1500.198595] env[69648]: DEBUG oslo.service.loopingcall [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1500.198984] env[69648]: DEBUG nova.compute.manager [-] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Skipping network deallocation for instance since networking was not requested. {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 1500.201116] env[69648]: DEBUG nova.compute.claims [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1500.201289] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1500.201498] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1500.211111] env[69648]: DEBUG oslo_vmware.rw_handles [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ffb3febf-ac7f-490a-beff-5391fe7ef189/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1500.270052] env[69648]: DEBUG oslo_vmware.rw_handles [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1500.270254] env[69648]: DEBUG oslo_vmware.rw_handles [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ffb3febf-ac7f-490a-beff-5391fe7ef189/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1500.441165] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e6152f7-58ac-4dca-882d-fd0d68add012 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.448799] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-999de514-37b1-4c44-8b83-75b7269b8ff0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.477843] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05488d42-2445-443f-ae25-08dfc8b96faf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.484567] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b610b06b-a49d-401e-b40b-46f832f5d306 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.497215] env[69648]: DEBUG nova.compute.provider_tree [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1500.505643] env[69648]: DEBUG nova.scheduler.client.report [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1500.522460] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.321s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1500.522984] env[69648]: ERROR nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1500.522984] env[69648]: Faults: ['InvalidArgument'] [ 1500.522984] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] Traceback (most recent call last): [ 1500.522984] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1500.522984] env[69648]: ERROR 
nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self.driver.spawn(context, instance, image_meta, [ 1500.522984] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1500.522984] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1500.522984] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1500.522984] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self._fetch_image_if_missing(context, vi) [ 1500.522984] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1500.522984] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] image_cache(vi, tmp_image_ds_loc) [ 1500.522984] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] vm_util.copy_virtual_disk( [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] session._wait_for_task(vmdk_copy_task) [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] return self.wait_for_task(task_ref) [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] return evt.wait() [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] result = hub.switch() [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] return self.greenlet.switch() [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1500.523406] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self.f(*self.args, **self.kw) [ 1500.523783] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1500.523783] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] raise exceptions.translate_fault(task_info.error) [ 1500.523783] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1500.523783] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] Faults: ['InvalidArgument'] [ 1500.523783] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] [ 1500.523783] env[69648]: DEBUG nova.compute.utils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1500.525191] env[69648]: DEBUG nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Build of instance e64fd474-91ab-449e-8785-e788685ed77a was re-scheduled: A specified parameter was not correct: fileType [ 1500.525191] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1500.525575] env[69648]: DEBUG nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1500.525822] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquiring lock "refresh_cache-e64fd474-91ab-449e-8785-e788685ed77a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1500.525978] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquired lock "refresh_cache-e64fd474-91ab-449e-8785-e788685ed77a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1500.526156] env[69648]: DEBUG nova.network.neutron [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1500.549370] env[69648]: DEBUG nova.network.neutron [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1500.604774] env[69648]: DEBUG nova.network.neutron [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1500.613073] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Releasing lock "refresh_cache-e64fd474-91ab-449e-8785-e788685ed77a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1500.613302] env[69648]: DEBUG nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1500.613487] env[69648]: DEBUG nova.compute.manager [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Skipping network deallocation for instance since networking was not requested. {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 1500.695905] env[69648]: INFO nova.scheduler.client.report [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Deleted allocations for instance e64fd474-91ab-449e-8785-e788685ed77a [ 1500.717992] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d4a92724-048d-4db9-9a83-2846bfe09e28 tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Lock "e64fd474-91ab-449e-8785-e788685ed77a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 605.382s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1500.719140] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Lock "e64fd474-91ab-449e-8785-e788685ed77a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 409.972s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1500.719366] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquiring lock "e64fd474-91ab-449e-8785-e788685ed77a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1500.719568] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Lock 
"e64fd474-91ab-449e-8785-e788685ed77a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1500.719740] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Lock "e64fd474-91ab-449e-8785-e788685ed77a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1500.721706] env[69648]: INFO nova.compute.manager [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Terminating instance [ 1500.723223] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquiring lock "refresh_cache-e64fd474-91ab-449e-8785-e788685ed77a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1500.723380] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Acquired lock "refresh_cache-e64fd474-91ab-449e-8785-e788685ed77a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1500.723612] env[69648]: DEBUG nova.network.neutron [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1500.737716] env[69648]: DEBUG nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1500.747670] env[69648]: DEBUG nova.network.neutron [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1500.783561] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1500.783841] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1500.785378] env[69648]: INFO nova.compute.claims [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1500.810678] env[69648]: DEBUG nova.network.neutron [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1500.819399] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Releasing lock "refresh_cache-e64fd474-91ab-449e-8785-e788685ed77a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1500.819797] env[69648]: DEBUG nova.compute.manager [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1500.819994] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1500.822219] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0113e3ac-3b76-4b70-b50b-8dcffa8cacbf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.833780] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea6b3a67-5c2e-4ebc-9ea6-5d8589aaeaae {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.863819] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e64fd474-91ab-449e-8785-e788685ed77a could not be found. [ 1500.864044] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1500.864234] env[69648]: INFO nova.compute.manager [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1500.864482] env[69648]: DEBUG oslo.service.loopingcall [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1500.866826] env[69648]: DEBUG nova.compute.manager [-] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1500.866933] env[69648]: DEBUG nova.network.neutron [-] [instance: e64fd474-91ab-449e-8785-e788685ed77a] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1500.984258] env[69648]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=69648) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1500.984800] env[69648]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-80e229bf-3eff-4eee-8744-923322c9a86f'] [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1500.985352] env[69648]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall result = f(*args, 
**kwargs) [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1500.985831] env[69648]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall File 
"/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1500.986279] env[69648]: ERROR oslo.service.loopingcall [ 1500.986641] env[69648]: ERROR nova.compute.manager [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1501.021008] env[69648]: ERROR nova.compute.manager [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1501.021008] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] Traceback (most recent call last): [ 1501.021008] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.021008] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] ret = obj(*args, **kwargs) [ 1501.021008] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1501.021008] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] exception_handler_v20(status_code, error_body) [ 1501.021008] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1501.021008] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] raise client_exc(message=error_message, [ 1501.021008] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1501.021008] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] Neutron server returns request_ids: ['req-80e229bf-3eff-4eee-8744-923322c9a86f'] [ 1501.021008] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] During handling of the above exception, another exception occurred: [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] Traceback (most recent call last): [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: 
e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self._delete_instance(context, instance, bdms) [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self._shutdown_instance(context, instance, bdms) [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self._try_deallocate_network(context, instance, requested_networks) [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] with excutils.save_and_reraise_exception(): [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1501.021414] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self.force_reraise() [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] raise self.value [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] _deallocate_network_with_retries() [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] return evt.wait() [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] result = hub.switch() [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] return self.greenlet.switch() [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1501.021752] env[69648]: ERROR nova.compute.manager [instance: 
e64fd474-91ab-449e-8785-e788685ed77a] result = func(*self.args, **self.kw) [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] result = f(*args, **kwargs) [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self._deallocate_network( [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self.network_api.deallocate_for_instance( [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] data = neutron.list_ports(**search_opts) [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] ret = obj(*args, **kwargs) [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] return self.list('ports', self.ports_path, retrieve_all, [ 1501.022092] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] ret = obj(*args, **kwargs) [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] for r in self._pagination(collection, path, **params): [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] res = self.get(path, params=params) [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] ret = obj(*args, **kwargs) [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] return self.retry_request("GET", action, body=body, [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] ret = obj(*args, **kwargs) [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1501.022415] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] return self.do_request(method, action, body=body, [ 1501.022738] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.022738] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] ret = obj(*args, **kwargs) [ 1501.022738] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1501.022738] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] self._handle_fault_response(status_code, replybody, resp) [ 1501.022738] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1501.022738] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1501.022738] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1501.022738] env[69648]: ERROR nova.compute.manager [instance: e64fd474-91ab-449e-8785-e788685ed77a] [ 1501.024952] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ac87989-baa6-40bb-b79e-bfa988d6270f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.033299] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ba1ca0e-a6b5-40ce-8eb1-d986c1778984 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.064499] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Lock "e64fd474-91ab-449e-8785-e788685ed77a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.345s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1501.066012] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa10340c-1443-40b4-abeb-fa43e47e48d0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.068778] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1501.069737] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "e64fd474-91ab-449e-8785-e788685ed77a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 327.052s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1501.069827] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: e64fd474-91ab-449e-8785-e788685ed77a] During sync_power_state the instance has a pending task (deleting). Skip. 
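Editor's note: the traceback ending above shows nova's Neutron API wrapper (nova/network/neutron.py, lines 196/212 in this build) catching the neutronclient 401 and re-raising it as NeutronAdminCredentialConfigurationInvalid. The sketch below only illustrates that translation pattern; the decorator, exception class, and helper names here are simplified stand-ins, not the actual nova code.

```python
# Hedged sketch of the exception-translation pattern visible in the traceback
# above: a decorator around neutronclient calls that converts Unauthorized
# (HTTP 401, i.e. rejected service credentials) into a Nova-level
# configuration error. All names are illustrative stand-ins for nova's code.
from neutronclient.common import exceptions as neutron_exc


class NeutronAdminCredentialConfigurationInvalid(Exception):
    """Stand-in for nova.exception.NeutronAdminCredentialConfigurationInvalid."""


def translate_neutron_errors(func):
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except neutron_exc.Unauthorized:
            # A 401 while using the service credentials means the [neutron]
            # section of nova.conf is misconfigured or the token has expired,
            # which is exactly the condition logged above.
            raise NeutronAdminCredentialConfigurationInvalid()
    return wrapper


@translate_neutron_errors
def list_instance_ports(client, instance_uuid):
    # Mirrors the deallocate_for_instance() call chain in the traceback:
    # neutron.list_ports(device_id=...) is where the 401 surfaces.
    return client.list_ports(device_id=instance_uuid)
```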
[ 1501.070019] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "e64fd474-91ab-449e-8785-e788685ed77a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1501.076215] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c202dbd1-5eff-4e4a-a4e0-e6fa4ec91a57 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.081581] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1501.090284] env[69648]: DEBUG nova.compute.provider_tree [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1501.096876] env[69648]: DEBUG nova.scheduler.client.report [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1501.108905] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.325s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1501.109406] env[69648]: DEBUG nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Start building networks asynchronously for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1501.111909] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.030s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1501.112115] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1501.112312] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1501.113500] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52bb42e3-cc47-4227-889f-e42fb28d0487 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.122448] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3a7c45a-2584-47b6-85e5-c3805533405b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.128401] env[69648]: INFO nova.compute.manager [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] [instance: e64fd474-91ab-449e-8785-e788685ed77a] Successfully reverted task state from None on failure for instance. [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server [None req-7741208a-4ffa-4766-bd70-6f62911cdd8c tempest-ServersAaction247Test-1393928757 tempest-ServersAaction247Test-1393928757-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-80e229bf-3eff-4eee-8744-923322c9a86f'] [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1501.139243] env[69648]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1501.139770] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance [ 1501.140378] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1501.140775] env[69648]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1501.141281] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.141281] env[69648]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1501.141781] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1501.142288] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1501.142288] env[69648]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1501.142288] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1501.142288] env[69648]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1501.142288] env[69648]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1501.142288] env[69648]: ERROR oslo_messaging.rpc.server [ 1501.142288] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2704176-dcc8-46cd-9694-2ea928d4b029 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.146943] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e2cc732-e4a2-4dd2-b507-fea6a3a02e8a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.151561] env[69648]: DEBUG nova.compute.utils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1501.153112] env[69648]: DEBUG nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1501.153271] env[69648]: DEBUG nova.network.neutron [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1501.185209] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180945MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1501.185209] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1501.185209] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1501.185209] env[69648]: DEBUG nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1501.254980] env[69648]: DEBUG nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1501.258679] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1501.258794] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d5fb115d-778d-4fc7-a03a-8f5828868a01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1501.258924] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ab839f84-b864-409e-883d-00dddb5db3db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1501.259060] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1501.259183] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a924bdee-1e16-4d78-ac6b-9574677de55f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1501.259300] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1501.259417] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 58804be5-ee46-4b25-be84-890d5cd1607f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1501.259530] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c97308be-406b-4fd0-b502-69e8c800773f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1501.259640] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1501.259755] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1501.270715] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 114bdafc-21f6-4a77-bf19-a444cbd8806c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1501.281378] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1501.284797] env[69648]: DEBUG nova.virt.hardware [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1501.285028] env[69648]: DEBUG nova.virt.hardware [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1501.285195] env[69648]: DEBUG nova.virt.hardware [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1501.285379] env[69648]: DEBUG nova.virt.hardware [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1501.285527] env[69648]: DEBUG nova.virt.hardware [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1501.285675] env[69648]: DEBUG nova.virt.hardware [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1501.285914] env[69648]: DEBUG nova.virt.hardware [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1501.286101] env[69648]: DEBUG nova.virt.hardware [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1501.286301] env[69648]: DEBUG nova.virt.hardware [None 
req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1501.286481] env[69648]: DEBUG nova.virt.hardware [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1501.286658] env[69648]: DEBUG nova.virt.hardware [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1501.287827] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cb4972b-6d07-4039-8770-7e34a9a70a1d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.294139] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1501.300150] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa310bc8-f195-4c40-b128-adbbd97528f9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.305268] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 97ce6d4b-ad90-47c7-885a-1f6632c8b97d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1501.305494] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1501.305641] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1501.465677] env[69648]: DEBUG nova.policy [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0a1e78d39d744d39b01da61d52a96c36', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '73994a87306e4ce088729c3bb5476f3e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1501.485497] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a051541a-1bec-4acd-9d8b-ce1afd22b234 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.493422] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42531445-ca90-4a6f-8a60-e1f86a9bb7cc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.525587] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-016896e4-ea4c-4e49-a8d2-ffffcfe6d832 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.533746] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4991e270-f4c5-42c5-8eda-f76b31afccbc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.546608] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1501.554512] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1501.568626] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1501.568810] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.386s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1501.781707] env[69648]: DEBUG nova.network.neutron [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Successfully created port: b06d1368-82ed-41df-8011-33bb4073776c {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1502.308701] env[69648]: DEBUG nova.compute.manager [req-1c4a8cc7-1d6a-42e8-a7d4-3f30fe7925fc req-fdfea9db-aabe-4125-bf89-fc95dd3a0d07 service nova] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Received event network-vif-plugged-b06d1368-82ed-41df-8011-33bb4073776c {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1502.308945] env[69648]: DEBUG oslo_concurrency.lockutils [req-1c4a8cc7-1d6a-42e8-a7d4-3f30fe7925fc req-fdfea9db-aabe-4125-bf89-fc95dd3a0d07 service nova] Acquiring lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1502.309167] env[69648]: DEBUG oslo_concurrency.lockutils [req-1c4a8cc7-1d6a-42e8-a7d4-3f30fe7925fc req-fdfea9db-aabe-4125-bf89-fc95dd3a0d07 service nova] Lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1502.309343] env[69648]: DEBUG oslo_concurrency.lockutils [req-1c4a8cc7-1d6a-42e8-a7d4-3f30fe7925fc req-fdfea9db-aabe-4125-bf89-fc95dd3a0d07 service nova] Lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1502.309514] env[69648]: DEBUG nova.compute.manager [req-1c4a8cc7-1d6a-42e8-a7d4-3f30fe7925fc req-fdfea9db-aabe-4125-bf89-fc95dd3a0d07 service nova] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] No waiting events found dispatching network-vif-plugged-b06d1368-82ed-41df-8011-33bb4073776c {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1502.309681] env[69648]: WARNING nova.compute.manager [req-1c4a8cc7-1d6a-42e8-a7d4-3f30fe7925fc req-fdfea9db-aabe-4125-bf89-fc95dd3a0d07 service nova] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Received unexpected event network-vif-plugged-b06d1368-82ed-41df-8011-33bb4073776c for instance with vm_state building and task_state spawning. 
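The resource tracker output above can be cross-checked by hand: ten tracked instances of the m1.nano flavor (1 vCPU, 128 MB RAM, 1 GB root disk) plus the 512 MB reserved in the MEMORY_MB inventory account for the reported used_vcpus=10, used_ram=1792MB and used_disk=10GB, while the VCPU allocation_ratio of 4.0 is what placement applies on top of the 48 physical vCPUs. A short illustrative check, using only values taken from this log:

# Illustrative arithmetic check of the resource-tracker numbers logged above;
# the inputs come from this log, this is not how Nova computes them verbatim.

FLAVOR = {"vcpus": 1, "memory_mb": 128, "root_gb": 1}   # m1.nano
TRACKED_INSTANCES = 10
RESERVED_MEMORY_MB = 512          # MEMORY_MB 'reserved' in the inventory
VCPU_ALLOCATION_RATIO = 4.0       # VCPU 'allocation_ratio' in the inventory
TOTAL_VCPUS = 48

used_vcpus = TRACKED_INSTANCES * FLAVOR["vcpus"]                              # 10
used_ram_mb = RESERVED_MEMORY_MB + TRACKED_INSTANCES * FLAVOR["memory_mb"]    # 1792
used_disk_gb = TRACKED_INSTANCES * FLAVOR["root_gb"]                          # 10
schedulable_vcpus = TOTAL_VCPUS * VCPU_ALLOCATION_RATIO                       # 192 on the placement side

print(used_vcpus, used_ram_mb, used_disk_gb, schedulable_vcpus)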
[ 1502.434136] env[69648]: DEBUG nova.network.neutron [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Successfully updated port: b06d1368-82ed-41df-8011-33bb4073776c {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1502.448813] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "refresh_cache-3dc3db1c-43c0-45e9-8283-38e77f66f06f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1502.448901] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquired lock "refresh_cache-3dc3db1c-43c0-45e9-8283-38e77f66f06f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1502.449067] env[69648]: DEBUG nova.network.neutron [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1502.488031] env[69648]: DEBUG nova.network.neutron [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1502.565417] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1502.691795] env[69648]: DEBUG nova.network.neutron [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Updating instance_info_cache with network_info: [{"id": "b06d1368-82ed-41df-8011-33bb4073776c", "address": "fa:16:3e:43:71:48", "network": {"id": "0ff6cf3f-cb54-4cbd-b96e-0612988b03df", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1838163903-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "73994a87306e4ce088729c3bb5476f3e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "113aa98d-90ca-43bc-a534-8908d1ec7d15", "external-id": "nsx-vlan-transportzone-186", "segmentation_id": 186, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb06d1368-82", "ovs_interfaceid": "b06d1368-82ed-41df-8011-33bb4073776c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1502.704216] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Releasing lock "refresh_cache-3dc3db1c-43c0-45e9-8283-38e77f66f06f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1502.704515] env[69648]: DEBUG nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Instance network_info: |[{"id": "b06d1368-82ed-41df-8011-33bb4073776c", "address": "fa:16:3e:43:71:48", "network": {"id": "0ff6cf3f-cb54-4cbd-b96e-0612988b03df", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1838163903-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "73994a87306e4ce088729c3bb5476f3e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "113aa98d-90ca-43bc-a534-8908d1ec7d15", "external-id": "nsx-vlan-transportzone-186", "segmentation_id": 186, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb06d1368-82", "ovs_interfaceid": 
"b06d1368-82ed-41df-8011-33bb4073776c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1502.704915] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:43:71:48', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '113aa98d-90ca-43bc-a534-8908d1ec7d15', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b06d1368-82ed-41df-8011-33bb4073776c', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1502.712517] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Creating folder: Project (73994a87306e4ce088729c3bb5476f3e). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1502.713031] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2c1093ed-3a6c-4ef1-9cd3-712ad23f8058 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.722867] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Created folder: Project (73994a87306e4ce088729c3bb5476f3e) in parent group-v692308. [ 1502.723063] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Creating folder: Instances. Parent ref: group-v692388. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1502.723276] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d2f637ca-0485-4eec-9403-4dd65448dc05 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.731571] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Created folder: Instances in parent group-v692388. [ 1502.731856] env[69648]: DEBUG oslo.service.loopingcall [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1502.731992] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1502.732196] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-df6812a4-00bd-42eb-a438-eb52b31cfad4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.750584] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1502.750584] env[69648]: value = "task-3466600" [ 1502.750584] env[69648]: _type = "Task" [ 1502.750584] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1502.757800] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466600, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1503.260369] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466600, 'name': CreateVM_Task, 'duration_secs': 0.307901} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1503.260611] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1503.261311] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1503.261479] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1503.261787] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1503.262039] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ce90bb19-b11e-4b86-bcfd-8ec3f868e275 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.266428] env[69648]: DEBUG oslo_vmware.api [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for the task: (returnval){ [ 1503.266428] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5299448b-62b3-0661-7252-5fe3344ea945" [ 1503.266428] env[69648]: _type = "Task" [ 1503.266428] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1503.273644] env[69648]: DEBUG oslo_vmware.api [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5299448b-62b3-0661-7252-5fe3344ea945, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1503.776973] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1503.777317] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1503.777475] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1504.065435] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1504.065623] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1504.065736] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1504.087213] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1504.087394] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1504.087545] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1504.087678] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1504.087806] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1504.087935] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1504.088070] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1504.088196] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1504.088316] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1504.088435] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1504.088559] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1504.338166] env[69648]: DEBUG nova.compute.manager [req-6366a508-8085-447b-b237-63d85b7c6c86 req-1a0ff622-96df-4053-85e4-f31a4585436e service nova] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Received event network-changed-b06d1368-82ed-41df-8011-33bb4073776c {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1504.338374] env[69648]: DEBUG nova.compute.manager [req-6366a508-8085-447b-b237-63d85b7c6c86 req-1a0ff622-96df-4053-85e4-f31a4585436e service nova] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Refreshing instance network info cache due to event network-changed-b06d1368-82ed-41df-8011-33bb4073776c. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1504.338587] env[69648]: DEBUG oslo_concurrency.lockutils [req-6366a508-8085-447b-b237-63d85b7c6c86 req-1a0ff622-96df-4053-85e4-f31a4585436e service nova] Acquiring lock "refresh_cache-3dc3db1c-43c0-45e9-8283-38e77f66f06f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1504.338731] env[69648]: DEBUG oslo_concurrency.lockutils [req-6366a508-8085-447b-b237-63d85b7c6c86 req-1a0ff622-96df-4053-85e4-f31a4585436e service nova] Acquired lock "refresh_cache-3dc3db1c-43c0-45e9-8283-38e77f66f06f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1504.338940] env[69648]: DEBUG nova.network.neutron [req-6366a508-8085-447b-b237-63d85b7c6c86 req-1a0ff622-96df-4053-85e4-f31a4585436e service nova] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Refreshing network info cache for port b06d1368-82ed-41df-8011-33bb4073776c {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1504.608714] env[69648]: DEBUG nova.network.neutron [req-6366a508-8085-447b-b237-63d85b7c6c86 req-1a0ff622-96df-4053-85e4-f31a4585436e service nova] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Updated VIF entry in instance network info cache for port b06d1368-82ed-41df-8011-33bb4073776c. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1504.609089] env[69648]: DEBUG nova.network.neutron [req-6366a508-8085-447b-b237-63d85b7c6c86 req-1a0ff622-96df-4053-85e4-f31a4585436e service nova] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Updating instance_info_cache with network_info: [{"id": "b06d1368-82ed-41df-8011-33bb4073776c", "address": "fa:16:3e:43:71:48", "network": {"id": "0ff6cf3f-cb54-4cbd-b96e-0612988b03df", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1838163903-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "73994a87306e4ce088729c3bb5476f3e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "113aa98d-90ca-43bc-a534-8908d1ec7d15", "external-id": "nsx-vlan-transportzone-186", "segmentation_id": 186, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb06d1368-82", "ovs_interfaceid": "b06d1368-82ed-41df-8011-33bb4073776c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1504.618667] env[69648]: DEBUG oslo_concurrency.lockutils [req-6366a508-8085-447b-b237-63d85b7c6c86 req-1a0ff622-96df-4053-85e4-f31a4585436e service nova] Releasing lock "refresh_cache-3dc3db1c-43c0-45e9-8283-38e77f66f06f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1517.222870] env[69648]: DEBUG oslo_concurrency.lockutils [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock 
"590dbeb2-7e21-454f-93b5-97065c5bfdb0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1520.247729] env[69648]: DEBUG oslo_concurrency.lockutils [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1545.819183] env[69648]: WARNING oslo_vmware.rw_handles [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1545.819183] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1545.819183] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1545.819183] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1545.819183] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1545.819183] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1545.819183] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1545.819183] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1545.819183] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1545.819183] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1545.819183] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1545.819183] env[69648]: ERROR oslo_vmware.rw_handles [ 1545.820197] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/ffb3febf-ac7f-490a-beff-5391fe7ef189/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1545.821480] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1545.821765] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Copying Virtual Disk [datastore1] vmware_temp/ffb3febf-ac7f-490a-beff-5391fe7ef189/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] 
vmware_temp/ffb3febf-ac7f-490a-beff-5391fe7ef189/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1545.822049] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-aff32141-86ed-4285-9205-f80de27c477f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.831445] env[69648]: DEBUG oslo_vmware.api [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for the task: (returnval){ [ 1545.831445] env[69648]: value = "task-3466601" [ 1545.831445] env[69648]: _type = "Task" [ 1545.831445] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1545.839203] env[69648]: DEBUG oslo_vmware.api [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Task: {'id': task-3466601, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1546.342387] env[69648]: DEBUG oslo_vmware.exceptions [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1546.342666] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1546.343247] env[69648]: ERROR nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1546.343247] env[69648]: Faults: ['InvalidArgument'] [ 1546.343247] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Traceback (most recent call last): [ 1546.343247] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1546.343247] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] yield resources [ 1546.343247] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1546.343247] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] self.driver.spawn(context, instance, image_meta, [ 1546.343247] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1546.343247] 
env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1546.343247] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1546.343247] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] self._fetch_image_if_missing(context, vi) [ 1546.343247] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] image_cache(vi, tmp_image_ds_loc) [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] vm_util.copy_virtual_disk( [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] session._wait_for_task(vmdk_copy_task) [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] return self.wait_for_task(task_ref) [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] return evt.wait() [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] result = hub.switch() [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1546.343563] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] return self.greenlet.switch() [ 1546.343960] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1546.343960] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] self.f(*self.args, **self.kw) [ 1546.343960] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1546.343960] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] raise exceptions.translate_fault(task_info.error) [ 1546.343960] env[69648]: ERROR nova.compute.manager [instance: 
fc2f697a-9f8c-4de1-a9a9-8606118663d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1546.343960] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Faults: ['InvalidArgument'] [ 1546.343960] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] [ 1546.343960] env[69648]: INFO nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Terminating instance [ 1546.345165] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1546.345410] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1546.345654] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2a2bff9a-37cb-4130-8b69-14f889457df1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.348424] env[69648]: DEBUG nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1546.348542] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1546.350092] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c89fe00-9cfb-474a-bc17-b2fcf2fce2e2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.354574] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1546.354645] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1546.357225] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6728fac0-49f5-42f3-aa73-099b1332244f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.359296] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1546.359512] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-660139d8-2ef4-439e-b385-32ee654a66e7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.363525] env[69648]: DEBUG oslo_vmware.api [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Waiting for the task: (returnval){ [ 1546.363525] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]527a9ff7-4c31-184f-ef63-cbd74b7798cc" [ 1546.363525] env[69648]: _type = "Task" [ 1546.363525] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1546.372034] env[69648]: DEBUG oslo_vmware.api [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]527a9ff7-4c31-184f-ef63-cbd74b7798cc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1546.420511] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1546.420728] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1546.420912] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Deleting the datastore file [datastore1] fc2f697a-9f8c-4de1-a9a9-8606118663d7 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1546.421197] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d2f6d77a-0580-4a5a-8ade-564f8548cef3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.427469] env[69648]: DEBUG oslo_vmware.api [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for the task: (returnval){ [ 1546.427469] env[69648]: value = "task-3466603" [ 1546.427469] env[69648]: _type = "Task" [ 1546.427469] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1546.435579] env[69648]: DEBUG oslo_vmware.api [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Task: {'id': task-3466603, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1546.873243] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1546.873512] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Creating directory with path [datastore1] vmware_temp/edba2ada-6039-495b-a54c-42f1df622997/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1546.873751] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d3660cfb-38ec-42ec-9a4e-5140b20bea1c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.885074] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Created directory with path [datastore1] vmware_temp/edba2ada-6039-495b-a54c-42f1df622997/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1546.885268] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Fetch image to [datastore1] vmware_temp/edba2ada-6039-495b-a54c-42f1df622997/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1546.885439] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/edba2ada-6039-495b-a54c-42f1df622997/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1546.886208] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e1e93bd-7af9-4a1d-8cc8-343400aa7cfa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.892496] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99780099-c3d1-4e6c-8ba2-b76b400c9de6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.901194] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d498d68f-612b-4897-a135-58f9dcc7dfd3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.933836] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ec644182-724f-4a87-bb1b-27535d135056 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.940577] env[69648]: DEBUG oslo_vmware.api [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Task: {'id': task-3466603, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076124} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1546.941927] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1546.942134] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1546.942311] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1546.942487] env[69648]: INFO nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Took 0.59 seconds to destroy the instance on the hypervisor. 
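
The DeleteDatastoreFile_Task entries above show the wait_for_task/_poll_task pattern: the API layer repeatedly reads task info, logs the progress percentage, and returns once the task reports success (or raises on error). The snippet below is a minimal, self-contained sketch of that polling loop under assumed names; wait_for_task, get_task_info and the task-info dict shape are illustrative stand-ins, not the oslo.vmware or vSphere API.

    import time

    class TaskError(Exception):
        """Stand-in for a task fault such as VimFaultException."""

    def wait_for_task(get_task_info, poll_interval=0.5, timeout=60):
        # Poll until the task reports 'success' or 'error', logging progress
        # in between -- the same shape as the "progress is 0%" /
        # "completed successfully" lines in the log above.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                raise TaskError(info.get('error', 'unknown task fault'))
            print("progress is %s%%" % info.get('progress', 0))
            time.sleep(poll_interval)
        raise TimeoutError('task did not complete in time')

    # Example: a fake task that succeeds on the third poll.
    _polls = iter([
        {'state': 'running', 'progress': 0},
        {'state': 'running', 'progress': 50},
        {'state': 'success', 'progress': 100, 'duration_secs': 0.076},
    ])
    print(wait_for_task(lambda: next(_polls), poll_interval=0.01))
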
[ 1546.944228] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7451618b-5f4e-4a34-9aea-8a867b61800c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.946058] env[69648]: DEBUG nova.compute.claims [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1546.946265] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1546.946489] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1546.966880] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1547.021277] env[69648]: DEBUG oslo_vmware.rw_handles [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/edba2ada-6039-495b-a54c-42f1df622997/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1547.082103] env[69648]: DEBUG oslo_vmware.rw_handles [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1547.082351] env[69648]: DEBUG oslo_vmware.rw_handles [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/edba2ada-6039-495b-a54c-42f1df622997/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1547.208801] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da7e0b07-f711-457a-802a-8fb9cd8b0e4b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.216331] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4356416-6a88-403e-ba59-bd85cf9fc7b2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.246491] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4c225e9-a9c3-4a38-874b-9d2b1f53321c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.253034] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06a5422f-8197-44aa-baa2-dfb2b398ee6b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.265479] env[69648]: DEBUG nova.compute.provider_tree [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1547.273974] env[69648]: DEBUG nova.scheduler.client.report [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1547.288188] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.342s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1547.288754] env[69648]: ERROR nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1547.288754] env[69648]: Faults: ['InvalidArgument'] [ 1547.288754] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Traceback (most recent call last): [ 1547.288754] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance 
[ 1547.288754] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] self.driver.spawn(context, instance, image_meta, [ 1547.288754] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1547.288754] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1547.288754] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1547.288754] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] self._fetch_image_if_missing(context, vi) [ 1547.288754] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1547.288754] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] image_cache(vi, tmp_image_ds_loc) [ 1547.288754] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] vm_util.copy_virtual_disk( [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] session._wait_for_task(vmdk_copy_task) [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] return self.wait_for_task(task_ref) [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] return evt.wait() [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] result = hub.switch() [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] return self.greenlet.switch() [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1547.289132] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] self.f(*self.args, **self.kw) [ 1547.289536] env[69648]: ERROR nova.compute.manager [instance: 
fc2f697a-9f8c-4de1-a9a9-8606118663d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1547.289536] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] raise exceptions.translate_fault(task_info.error) [ 1547.289536] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1547.289536] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Faults: ['InvalidArgument'] [ 1547.289536] env[69648]: ERROR nova.compute.manager [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] [ 1547.289536] env[69648]: DEBUG nova.compute.utils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1547.290949] env[69648]: DEBUG nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Build of instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 was re-scheduled: A specified parameter was not correct: fileType [ 1547.290949] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1547.291331] env[69648]: DEBUG nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1547.291508] env[69648]: DEBUG nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1547.291675] env[69648]: DEBUG nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1547.291837] env[69648]: DEBUG nova.network.neutron [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1547.662347] env[69648]: DEBUG nova.network.neutron [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1547.675597] env[69648]: INFO nova.compute.manager [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Took 0.38 seconds to deallocate network for instance. [ 1547.769305] env[69648]: INFO nova.scheduler.client.report [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Deleted allocations for instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 [ 1547.793034] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e4b89185-a929-4597-9ff8-95a16aec58bc tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 620.355s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1547.795105] env[69648]: DEBUG oslo_concurrency.lockutils [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 424.427s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1547.795105] env[69648]: DEBUG oslo_concurrency.lockutils [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1547.795105] env[69648]: DEBUG oslo_concurrency.lockutils [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1547.795454] env[69648]: DEBUG oslo_concurrency.lockutils [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1547.797291] env[69648]: INFO nova.compute.manager [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Terminating instance [ 1547.799309] env[69648]: DEBUG nova.compute.manager [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1547.799480] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1547.800158] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5df12db1-860e-40cc-be24-2f24ee34fecb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.811493] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-924d7cfa-e268-41cc-b417-82824f06f63f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.822761] env[69648]: DEBUG nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1547.845046] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fc2f697a-9f8c-4de1-a9a9-8606118663d7 could not be found. 
[ 1547.845235] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1547.845416] env[69648]: INFO nova.compute.manager [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1547.845656] env[69648]: DEBUG oslo.service.loopingcall [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1547.845892] env[69648]: DEBUG nova.compute.manager [-] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1547.845998] env[69648]: DEBUG nova.network.neutron [-] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1547.874538] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1547.874784] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1547.876279] env[69648]: INFO nova.compute.claims [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1547.881925] env[69648]: DEBUG nova.network.neutron [-] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1547.892663] env[69648]: INFO nova.compute.manager [-] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] Took 0.05 seconds to deallocate network for instance. 
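
The instance_claim/abort_instance_claim entries serialize on the "compute_resources" lock and then report the same provider inventory to the scheduler each time. The sketch below reuses the inventory data printed in the log and derives per-resource-class capacity as (total - reserved) * allocation_ratio; that formula is an assumption about how Placement-style inventories are commonly interpreted, not something stated in this log.

    # Inventory data as reported for provider d38a352b-7808-44da-8216-792e96aadc88.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    def usable_capacity(inv):
        # Assumed interpretation: capacity = (total - reserved) * allocation_ratio.
        return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
                for rc, v in inv.items()}

    print(usable_capacity(inventory))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}
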
[ 1547.992907] env[69648]: DEBUG oslo_concurrency.lockutils [None req-8b89ae5e-e78d-4576-86e1-9f34eba35eb2 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1547.993855] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 373.976s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1547.994075] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: fc2f697a-9f8c-4de1-a9a9-8606118663d7] During sync_power_state the instance has a pending task (deleting). Skip. [ 1547.994359] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "fc2f697a-9f8c-4de1-a9a9-8606118663d7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1548.108021] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aaea6d7-fee3-4562-9eda-0215b3f20912 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.115468] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1900697d-c433-41d4-86a5-5c9e23a09285 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.145500] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17daeb9c-defb-4e1d-9be0-1e49368e96b1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.152341] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad2bc0d7-6bff-4a5d-a4e5-2e34e8dcedee {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.164994] env[69648]: DEBUG nova.compute.provider_tree [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1548.174179] env[69648]: DEBUG nova.scheduler.client.report [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 
400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1548.187982] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.313s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1548.188470] env[69648]: DEBUG nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1548.225950] env[69648]: DEBUG nova.compute.utils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1548.227308] env[69648]: DEBUG nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1548.227483] env[69648]: DEBUG nova.network.neutron [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1548.235246] env[69648]: DEBUG nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1548.281411] env[69648]: DEBUG nova.policy [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'caf89555b5df4f5fa4cac41f6b1792db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca41677808a749f1b88e43a112db7fb2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1548.294898] env[69648]: DEBUG nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1548.321015] env[69648]: DEBUG nova.virt.hardware [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1548.321290] env[69648]: DEBUG nova.virt.hardware [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1548.321455] env[69648]: DEBUG nova.virt.hardware [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1548.321636] env[69648]: DEBUG nova.virt.hardware [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1548.321783] env[69648]: DEBUG nova.virt.hardware [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1548.321930] env[69648]: DEBUG nova.virt.hardware [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1548.322153] env[69648]: DEBUG nova.virt.hardware [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1548.322320] env[69648]: DEBUG nova.virt.hardware [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1548.322488] 
env[69648]: DEBUG nova.virt.hardware [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1548.322648] env[69648]: DEBUG nova.virt.hardware [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1548.322819] env[69648]: DEBUG nova.virt.hardware [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1548.323700] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-435a14ed-b0c8-4b99-9443-4f984c1e4e36 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.332253] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27d61dd4-400a-467f-b4ba-7f0d0c04fadc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.650997] env[69648]: DEBUG nova.network.neutron [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Successfully created port: 43f04108-bc0f-4bab-b088-501559d95c13 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1549.251939] env[69648]: DEBUG nova.network.neutron [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Successfully updated port: 43f04108-bc0f-4bab-b088-501559d95c13 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1549.263707] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "refresh_cache-114bdafc-21f6-4a77-bf19-a444cbd8806c" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1549.263864] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquired lock "refresh_cache-114bdafc-21f6-4a77-bf19-a444cbd8806c" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1549.264027] env[69648]: DEBUG nova.network.neutron [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1549.331213] env[69648]: DEBUG 
nova.network.neutron [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1549.536842] env[69648]: DEBUG nova.network.neutron [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Updating instance_info_cache with network_info: [{"id": "43f04108-bc0f-4bab-b088-501559d95c13", "address": "fa:16:3e:58:d9:91", "network": {"id": "c571b585-f722-4160-a8c7-ee2a6eb60153", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-12468984-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ca41677808a749f1b88e43a112db7fb2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43f04108-bc", "ovs_interfaceid": "43f04108-bc0f-4bab-b088-501559d95c13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1549.550241] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Releasing lock "refresh_cache-114bdafc-21f6-4a77-bf19-a444cbd8806c" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1549.550542] env[69648]: DEBUG nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Instance network_info: |[{"id": "43f04108-bc0f-4bab-b088-501559d95c13", "address": "fa:16:3e:58:d9:91", "network": {"id": "c571b585-f722-4160-a8c7-ee2a6eb60153", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-12468984-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ca41677808a749f1b88e43a112db7fb2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43f04108-bc", "ovs_interfaceid": 
"43f04108-bc0f-4bab-b088-501559d95c13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1549.550940] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:58:d9:91', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1a55f45a-d631-4ebc-b73b-8a30bd0a32a8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '43f04108-bc0f-4bab-b088-501559d95c13', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1549.558695] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Creating folder: Project (ca41677808a749f1b88e43a112db7fb2). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1549.559222] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-860e2266-b812-4805-8b99-3d390b14d159 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1549.571981] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Created folder: Project (ca41677808a749f1b88e43a112db7fb2) in parent group-v692308. [ 1549.571981] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Creating folder: Instances. Parent ref: group-v692391. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1549.571981] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8ab38913-2b67-4657-9529-a7658473e7f2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1549.580249] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Created folder: Instances in parent group-v692391. [ 1549.580636] env[69648]: DEBUG oslo.service.loopingcall [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1549.580732] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1549.580927] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6889feeb-b92a-4041-99f8-116521c890d0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1549.599145] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1549.599145] env[69648]: value = "task-3466606" [ 1549.599145] env[69648]: _type = "Task" [ 1549.599145] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1549.606301] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466606, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1549.700450] env[69648]: DEBUG nova.compute.manager [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Received event network-vif-plugged-43f04108-bc0f-4bab-b088-501559d95c13 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1549.700733] env[69648]: DEBUG oslo_concurrency.lockutils [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] Acquiring lock "114bdafc-21f6-4a77-bf19-a444cbd8806c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1549.701239] env[69648]: DEBUG oslo_concurrency.lockutils [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] Lock "114bdafc-21f6-4a77-bf19-a444cbd8806c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1549.702239] env[69648]: DEBUG oslo_concurrency.lockutils [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] Lock "114bdafc-21f6-4a77-bf19-a444cbd8806c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1549.702239] env[69648]: DEBUG nova.compute.manager [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] No waiting events found dispatching network-vif-plugged-43f04108-bc0f-4bab-b088-501559d95c13 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1549.702239] env[69648]: WARNING nova.compute.manager [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Received unexpected event network-vif-plugged-43f04108-bc0f-4bab-b088-501559d95c13 for instance with vm_state building and task_state spawning. 
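
The network-vif-plugged handling above shows the external-event bookkeeping: an event is only "expected" if something registered a waiter for it first, and since the log reports "No waiting events found dispatching" the port's plug event, the manager logs the "Received unexpected event" WARNING and moves on. Below is a minimal, thread-based sketch of that registry under assumed names; it is an illustration, not nova.compute.manager.InstanceEvents.

    import threading

    class InstanceEvents:
        # Maps event name -> threading.Event for callers that want to wait.
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = {}

        def prepare(self, name):
            ev = threading.Event()
            with self._lock:
                self._waiters[name] = ev
            return ev

        def pop_instance_event(self, name):
            with self._lock:
                return self._waiters.pop(name, None)

    def handle_external_event(events, name):
        waiter = events.pop_instance_event(name)
        if waiter is None:
            # Mirrors the "Received unexpected event ..." WARNING above.
            print('Received unexpected event %s' % name)
        else:
            waiter.set()

    events = InstanceEvents()
    # Nothing registered a waiter for this port, so the event is "unexpected".
    handle_external_event(
        events, 'network-vif-plugged-43f04108-bc0f-4bab-b088-501559d95c13')
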
[ 1549.702239] env[69648]: DEBUG nova.compute.manager [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Received event network-changed-43f04108-bc0f-4bab-b088-501559d95c13 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1549.702421] env[69648]: DEBUG nova.compute.manager [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Refreshing instance network info cache due to event network-changed-43f04108-bc0f-4bab-b088-501559d95c13. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1549.702585] env[69648]: DEBUG oslo_concurrency.lockutils [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] Acquiring lock "refresh_cache-114bdafc-21f6-4a77-bf19-a444cbd8806c" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1549.702789] env[69648]: DEBUG oslo_concurrency.lockutils [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] Acquired lock "refresh_cache-114bdafc-21f6-4a77-bf19-a444cbd8806c" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1549.702958] env[69648]: DEBUG nova.network.neutron [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Refreshing network info cache for port 43f04108-bc0f-4bab-b088-501559d95c13 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1549.972860] env[69648]: DEBUG nova.network.neutron [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Updated VIF entry in instance network info cache for port 43f04108-bc0f-4bab-b088-501559d95c13. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1549.973236] env[69648]: DEBUG nova.network.neutron [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Updating instance_info_cache with network_info: [{"id": "43f04108-bc0f-4bab-b088-501559d95c13", "address": "fa:16:3e:58:d9:91", "network": {"id": "c571b585-f722-4160-a8c7-ee2a6eb60153", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-12468984-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ca41677808a749f1b88e43a112db7fb2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43f04108-bc", "ovs_interfaceid": "43f04108-bc0f-4bab-b088-501559d95c13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1549.982820] env[69648]: DEBUG oslo_concurrency.lockutils [req-e1a37ba7-bd25-4c8c-978d-4dff48153224 req-83b7bea7-48da-4ef4-81b0-8bad64fbe2c0 service nova] Releasing lock "refresh_cache-114bdafc-21f6-4a77-bf19-a444cbd8806c" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1550.109311] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466606, 'name': CreateVM_Task, 'duration_secs': 0.289652} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1550.109486] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1550.110138] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1550.110308] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1550.110640] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1550.110880] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-967551ea-f201-488a-8257-48a6c9f765e4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1550.115327] env[69648]: DEBUG oslo_vmware.api [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for the task: (returnval){ [ 1550.115327] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c5d290-d826-288d-a3b6-eb58be4ece79" [ 1550.115327] env[69648]: _type = "Task" [ 1550.115327] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1550.123810] env[69648]: DEBUG oslo_vmware.api [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52c5d290-d826-288d-a3b6-eb58be4ece79, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1550.626176] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1550.626550] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1550.626686] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1554.065078] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1554.065078] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1557.066716] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1558.065232] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1560.060410] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1562.065471] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1562.065760] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1562.065858] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None 
None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1563.066029] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1563.077540] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1563.077540] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1563.077684] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1563.077779] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1563.079394] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a71418f2-5d21-4a4c-9289-478380f4902a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.088022] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9496f22e-386f-4436-baef-d443a1fb59cc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.101912] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d14948fc-08b4-4d61-9ea8-9fdbfe71f79d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.108170] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c4c3ec3-eb8d-4cd7-acc0-5150633e86bb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.136623] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180968MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1563.136775] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1563.136977] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1563.207027] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance d5fb115d-778d-4fc7-a03a-8f5828868a01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1563.207027] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ab839f84-b864-409e-883d-00dddb5db3db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1563.207027] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1563.207027] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a924bdee-1e16-4d78-ac6b-9574677de55f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1563.207332] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1563.207332] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 58804be5-ee46-4b25-be84-890d5cd1607f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1563.207332] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c97308be-406b-4fd0-b502-69e8c800773f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1563.207483] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1563.207483] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1563.207567] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 114bdafc-21f6-4a77-bf19-a444cbd8806c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1563.219854] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1563.229677] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1563.239013] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 97ce6d4b-ad90-47c7-885a-1f6632c8b97d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1563.239232] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1563.239381] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1563.387139] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21c810ca-5ed3-488d-8038-98b6e529da8c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.394447] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25c8dc47-66ad-4973-af80-57f304d3a795 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.424324] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdca8147-4c0a-408e-a502-337ebd36177f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.431303] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb8ae395-1327-4c82-9535-ae54aa8be789 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.443932] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1563.452798] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1563.467250] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1563.467461] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.330s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1566.466749] env[69648]: DEBUG oslo_service.periodic_task [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1566.467079] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1566.467079] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1566.486419] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1566.486574] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1566.486712] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1566.486843] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1566.486971] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1566.487113] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1566.487239] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1566.487375] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1566.487508] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1566.487628] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1566.487752] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1583.264298] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "723972b1-3f91-4c59-b265-3975644dadb2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1583.264589] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "723972b1-3f91-4c59-b265-3975644dadb2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1585.886276] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1594.656144] env[69648]: WARNING oslo_vmware.rw_handles [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1594.656144] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1594.656144] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1594.656144] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1594.656144] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1594.656144] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1594.656144] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1594.656144] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1594.656144] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1594.656144] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1594.656144] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1594.656144] env[69648]: ERROR oslo_vmware.rw_handles [ 1594.656814] 
env[69648]: DEBUG nova.virt.vmwareapi.images [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/edba2ada-6039-495b-a54c-42f1df622997/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1594.658801] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1594.659124] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Copying Virtual Disk [datastore1] vmware_temp/edba2ada-6039-495b-a54c-42f1df622997/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/edba2ada-6039-495b-a54c-42f1df622997/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1594.659461] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-374fac1d-457a-46b1-8843-1ff3c7387b85 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.667194] env[69648]: DEBUG oslo_vmware.api [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Waiting for the task: (returnval){ [ 1594.667194] env[69648]: value = "task-3466607" [ 1594.667194] env[69648]: _type = "Task" [ 1594.667194] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1594.675190] env[69648]: DEBUG oslo_vmware.api [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Task: {'id': task-3466607, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1595.178134] env[69648]: DEBUG oslo_vmware.exceptions [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1595.178442] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1595.179053] env[69648]: ERROR nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1595.179053] env[69648]: Faults: ['InvalidArgument'] [ 1595.179053] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Traceback (most recent call last): [ 1595.179053] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1595.179053] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] yield resources [ 1595.179053] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1595.179053] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] self.driver.spawn(context, instance, image_meta, [ 1595.179053] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1595.179053] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1595.179053] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1595.179053] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] self._fetch_image_if_missing(context, vi) [ 1595.179053] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] image_cache(vi, tmp_image_ds_loc) [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] vm_util.copy_virtual_disk( [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] session._wait_for_task(vmdk_copy_task) [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] return self.wait_for_task(task_ref) [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] return evt.wait() [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] result = hub.switch() [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1595.179395] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] return self.greenlet.switch() [ 1595.179731] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1595.179731] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] self.f(*self.args, **self.kw) [ 1595.179731] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1595.179731] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] raise exceptions.translate_fault(task_info.error) [ 1595.179731] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1595.179731] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Faults: ['InvalidArgument'] [ 1595.179731] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] [ 1595.179731] env[69648]: INFO nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Terminating instance [ 1595.180950] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1595.181186] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1595.182027] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-5313278d-b383-401d-9587-ad17acea781a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.183843] env[69648]: DEBUG nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1595.184054] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1595.184853] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb6334b6-84a1-490c-bb63-84d0f83c2f22 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.191091] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1595.191280] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f60637b6-174d-4079-895d-d0af7b0c39cc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.193368] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1595.193544] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1595.194475] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dfabc0db-b55f-498f-bc78-4e968144dfdf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.199505] env[69648]: DEBUG oslo_vmware.api [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Waiting for the task: (returnval){ [ 1595.199505] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52510836-3882-3e31-7ec1-856ae7ee2beb" [ 1595.199505] env[69648]: _type = "Task" [ 1595.199505] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1595.206483] env[69648]: DEBUG oslo_vmware.api [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52510836-3882-3e31-7ec1-856ae7ee2beb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1595.259350] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1595.259569] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1595.259863] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Deleting the datastore file [datastore1] d5fb115d-778d-4fc7-a03a-8f5828868a01 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1595.260162] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f598906f-bba6-4634-b648-71a7c9ef0245 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.266493] env[69648]: DEBUG oslo_vmware.api [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Waiting for the task: (returnval){ [ 1595.266493] env[69648]: value = "task-3466609" [ 1595.266493] env[69648]: _type = "Task" [ 1595.266493] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1595.274547] env[69648]: DEBUG oslo_vmware.api [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Task: {'id': task-3466609, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1595.566090] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "18745ec2-477d-427d-b2dd-997f73d9fd53" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1595.566344] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "18745ec2-477d-427d-b2dd-997f73d9fd53" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1595.593291] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1595.593514] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1595.711192] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1595.711509] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Creating directory with path [datastore1] vmware_temp/894bf848-495b-4e97-92cf-92a4ce55101a/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1595.711687] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0be69962-cbc4-4666-9e45-be8e1ec01761 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.723474] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Created directory with path [datastore1] vmware_temp/894bf848-495b-4e97-92cf-92a4ce55101a/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1595.723667] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None 
req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Fetch image to [datastore1] vmware_temp/894bf848-495b-4e97-92cf-92a4ce55101a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1595.723876] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/894bf848-495b-4e97-92cf-92a4ce55101a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1595.724653] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caf5d809-ff34-4ae1-bb5b-c30fb7f11264 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.731122] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d320c6ea-3482-4ade-b5dd-e2dabf73b9e3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.740791] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c37472c2-de1d-44b8-968c-af19b733f4bc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.774988] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-630c6040-0135-467f-b084-a42dae465c32 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.781591] env[69648]: DEBUG oslo_vmware.api [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Task: {'id': task-3466609, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074485} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1595.782962] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1595.783173] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1595.783349] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1595.783524] env[69648]: INFO nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1595.785516] env[69648]: DEBUG nova.compute.claims [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1595.785690] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1595.785902] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1595.788415] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a3b4d03a-17a5-47de-9d04-4b060f9b58cc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.810131] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1595.964420] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 
tempest-DeleteServersAdminTestJSON-897165489-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1595.965195] env[69648]: ERROR nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. [ 1595.965195] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Traceback (most recent call last): [ 1595.965195] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1595.965195] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1595.965195] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1595.965195] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] result = getattr(controller, method)(*args, **kwargs) [ 1595.965195] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1595.965195] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._get(image_id) [ 1595.965195] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1595.965195] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1595.965195] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] resp, body = self.http_client.get(url, headers=header) [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.request(url, 'GET', **kwargs) [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._handle_response(resp) [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise 
exc.from_response(resp, resp.content) [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] During handling of the above exception, another exception occurred: [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1595.965561] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Traceback (most recent call last): [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] yield resources [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self.driver.spawn(context, instance, image_meta, [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._fetch_image_if_missing(context, vi) [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] image_fetch(context, vi, tmp_image_ds_loc) [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] images.fetch_image( [ 1595.965856] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] metadata = IMAGE_API.get(context, image_ref) [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return session.show(context, image_id, [ 
1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] _reraise_translated_image_exception(image_id) [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise new_exc.with_traceback(exc_trace) [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] result = getattr(controller, method)(*args, **kwargs) [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1595.966183] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._get(image_id) [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] resp, body = self.http_client.get(url, headers=header) [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.request(url, 'GET', **kwargs) [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._handle_response(resp) [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise exc.from_response(resp, resp.content) [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] 
nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. [ 1595.966722] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1595.967193] env[69648]: INFO nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Terminating instance [ 1595.967193] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1595.967287] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1595.970076] env[69648]: DEBUG nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1595.970300] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1595.970564] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7467e63d-ac4d-46f6-9b8a-51bd431d7c53 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.973180] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d6957c8-f6bd-4b90-8607-36913cbffcf1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.981211] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1595.981454] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d8e50fb8-3d09-43aa-8dae-625305c2f320 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.983866] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1595.984564] 
env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1595.985107] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cfc404ef-57eb-4640-b2df-39edfeef6256 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.992151] env[69648]: DEBUG oslo_vmware.api [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for the task: (returnval){ [ 1595.992151] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]529853e7-239f-9c1d-9484-ccdb55a6935b" [ 1595.992151] env[69648]: _type = "Task" [ 1595.992151] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1596.000069] env[69648]: DEBUG oslo_vmware.api [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]529853e7-239f-9c1d-9484-ccdb55a6935b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1596.026222] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3127b443-b39e-4132-9af2-7f1f97d24f15 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.033870] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d6922d7-c94b-4c37-adec-0b273bc19083 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.066179] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94967d8b-d0e6-43ef-a67c-df69bd513725 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.068828] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1596.069033] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1596.069214] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Deleting the datastore file [datastore1] ab839f84-b864-409e-883d-00dddb5db3db {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} 
[ 1596.069455] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-02dc57ed-c4f7-4ec0-b5bc-889581b64b60 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.076329] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e41bec72-de63-4559-af53-c4e97ba1e514 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.080070] env[69648]: DEBUG oslo_vmware.api [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Waiting for the task: (returnval){ [ 1596.080070] env[69648]: value = "task-3466611" [ 1596.080070] env[69648]: _type = "Task" [ 1596.080070] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1596.091409] env[69648]: DEBUG nova.compute.provider_tree [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1596.096879] env[69648]: DEBUG oslo_vmware.api [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Task: {'id': task-3466611, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1596.101859] env[69648]: DEBUG nova.scheduler.client.report [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1596.116926] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.331s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1596.117515] env[69648]: ERROR nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1596.117515] env[69648]: Faults: ['InvalidArgument'] [ 1596.117515] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Traceback (most recent call last): [ 1596.117515] 
env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1596.117515] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] self.driver.spawn(context, instance, image_meta, [ 1596.117515] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1596.117515] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1596.117515] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1596.117515] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] self._fetch_image_if_missing(context, vi) [ 1596.117515] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1596.117515] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] image_cache(vi, tmp_image_ds_loc) [ 1596.117515] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] vm_util.copy_virtual_disk( [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] session._wait_for_task(vmdk_copy_task) [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] return self.wait_for_task(task_ref) [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] return evt.wait() [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] result = hub.switch() [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] return self.greenlet.switch() [ 1596.117892] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1596.117892] env[69648]: ERROR nova.compute.manager 
[instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] self.f(*self.args, **self.kw) [ 1596.118218] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1596.118218] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] raise exceptions.translate_fault(task_info.error) [ 1596.118218] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1596.118218] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Faults: ['InvalidArgument'] [ 1596.118218] env[69648]: ERROR nova.compute.manager [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] [ 1596.118333] env[69648]: DEBUG nova.compute.utils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1596.119659] env[69648]: DEBUG nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Build of instance d5fb115d-778d-4fc7-a03a-8f5828868a01 was re-scheduled: A specified parameter was not correct: fileType [ 1596.119659] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1596.120062] env[69648]: DEBUG nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1596.120260] env[69648]: DEBUG nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1596.120435] env[69648]: DEBUG nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1596.120630] env[69648]: DEBUG nova.network.neutron [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1596.502782] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1596.503046] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Creating directory with path [datastore1] vmware_temp/a0333ee3-3974-46ff-8989-3be6b994e05d/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1596.503295] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b3ed5ae0-85c5-4e46-b5ca-a588324a81b8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.514884] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Created directory with path [datastore1] vmware_temp/a0333ee3-3974-46ff-8989-3be6b994e05d/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1596.515113] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Fetch image to [datastore1] vmware_temp/a0333ee3-3974-46ff-8989-3be6b994e05d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1596.515317] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/a0333ee3-3974-46ff-8989-3be6b994e05d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1596.516058] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75fff740-c595-4715-9dd2-e7f5d39381a2 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.522794] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f0cb760-9670-4280-832d-cdd64ac29e06 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.532395] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5b09fcc-6159-42dd-aa6d-2e6d77730a21 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.570591] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b528a08f-6267-4d98-a40f-f051b2a502e3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.578151] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4183eefd-3c5e-4fd9-9658-508721f84f4f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.589335] env[69648]: DEBUG oslo_vmware.api [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Task: {'id': task-3466611, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071781} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1596.590908] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1596.590908] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1596.590908] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1596.590908] env[69648]: INFO nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 1596.592279] env[69648]: DEBUG nova.compute.claims [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1596.592455] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1596.593497] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1596.597182] env[69648]: DEBUG nova.network.neutron [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1596.598529] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1596.611406] env[69648]: INFO nova.compute.manager [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Took 0.49 seconds to deallocate network for instance. [ 1596.656042] env[69648]: DEBUG oslo_vmware.rw_handles [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a0333ee3-3974-46ff-8989-3be6b994e05d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1596.724346] env[69648]: DEBUG oslo_vmware.rw_handles [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Completed reading data from the image iterator. 
{{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1596.724582] env[69648]: DEBUG oslo_vmware.rw_handles [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a0333ee3-3974-46ff-8989-3be6b994e05d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1596.775982] env[69648]: INFO nova.scheduler.client.report [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Deleted allocations for instance d5fb115d-778d-4fc7-a03a-8f5828868a01 [ 1596.797548] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a01a0d1b-8bb8-49ec-9c1e-41683447fd6a tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Lock "d5fb115d-778d-4fc7-a03a-8f5828868a01" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 605.440s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1596.799112] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "d5fb115d-778d-4fc7-a03a-8f5828868a01" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 422.781s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1596.799323] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 1596.799499] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "d5fb115d-778d-4fc7-a03a-8f5828868a01" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1596.800009] env[69648]: DEBUG oslo_concurrency.lockutils [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Lock "d5fb115d-778d-4fc7-a03a-8f5828868a01" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 409.519s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1596.800352] env[69648]: DEBUG oslo_concurrency.lockutils [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Acquiring lock "d5fb115d-778d-4fc7-a03a-8f5828868a01-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1596.800563] env[69648]: DEBUG oslo_concurrency.lockutils [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Lock "d5fb115d-778d-4fc7-a03a-8f5828868a01-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1596.800734] env[69648]: DEBUG oslo_concurrency.lockutils [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Lock "d5fb115d-778d-4fc7-a03a-8f5828868a01-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1596.802769] env[69648]: INFO nova.compute.manager [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Terminating instance [ 1596.804427] env[69648]: DEBUG nova.compute.manager [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1596.804643] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1596.804915] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d181547e-153c-421a-bbd0-d639f730b732 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.810842] env[69648]: DEBUG nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1596.819721] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8cf752f-afc7-45bb-92ae-4be0ceac7a16 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.851818] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d5fb115d-778d-4fc7-a03a-8f5828868a01 could not be found. [ 1596.852076] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1596.852316] env[69648]: INFO nova.compute.manager [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1596.852540] env[69648]: DEBUG oslo.service.loopingcall [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1596.855786] env[69648]: DEBUG nova.compute.manager [-] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1596.855895] env[69648]: DEBUG nova.network.neutron [-] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1596.879516] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1596.886902] env[69648]: DEBUG nova.network.neutron [-] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1596.894795] env[69648]: INFO nova.compute.manager [-] [instance: d5fb115d-778d-4fc7-a03a-8f5828868a01] Took 0.04 seconds to deallocate network for instance. [ 1596.972381] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a565e475-e7ca-4d43-b7a6-cb6e9edb539b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.980575] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a1afaf7-99ad-4bac-b9df-919d56164283 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.988685] env[69648]: DEBUG oslo_concurrency.lockutils [None req-55f606b7-f0d1-4273-a67a-d311a482618e tempest-ServerPasswordTestJSON-1316900133 tempest-ServerPasswordTestJSON-1316900133-project-member] Lock "d5fb115d-778d-4fc7-a03a-8f5828868a01" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.189s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.017192] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dae70ea-6696-4443-9935-8417d092f700 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.024644] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2485e26b-3584-43c6-b4a3-f1564b52a20d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.037869] env[69648]: DEBUG nova.compute.provider_tree [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1597.045976] env[69648]: DEBUG nova.scheduler.client.report [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Inventory has not 
changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1597.059235] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.466s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.059926] env[69648]: ERROR nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. [ 1597.059926] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Traceback (most recent call last): [ 1597.059926] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1597.059926] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1597.059926] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1597.059926] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] result = getattr(controller, method)(*args, **kwargs) [ 1597.059926] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1597.059926] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._get(image_id) [ 1597.059926] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1597.059926] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1597.059926] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] resp, body = self.http_client.get(url, headers=header) [ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.request(url, 'GET', **kwargs) 
[ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._handle_response(resp) [ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise exc.from_response(resp, resp.content) [ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] During handling of the above exception, another exception occurred: [ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.060277] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Traceback (most recent call last): [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self.driver.spawn(context, instance, image_meta, [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._fetch_image_if_missing(context, vi) [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] image_fetch(context, vi, tmp_image_ds_loc) [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] images.fetch_image( [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: 
ab839f84-b864-409e-883d-00dddb5db3db] metadata = IMAGE_API.get(context, image_ref) [ 1597.060614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return session.show(context, image_id, [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] _reraise_translated_image_exception(image_id) [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise new_exc.with_traceback(exc_trace) [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] result = getattr(controller, method)(*args, **kwargs) [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._get(image_id) [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1597.060975] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1597.061340] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1597.061340] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] resp, body = self.http_client.get(url, headers=header) [ 1597.061340] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1597.061340] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.request(url, 'GET', **kwargs) [ 1597.061340] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1597.061340] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._handle_response(resp) [ 1597.061340] env[69648]: ERROR nova.compute.manager [instance: 
ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1597.061340] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise exc.from_response(resp, resp.content) [ 1597.061340] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. [ 1597.061340] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.061611] env[69648]: DEBUG nova.compute.utils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1597.061611] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.182s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1597.063051] env[69648]: INFO nova.compute.claims [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1597.065810] env[69648]: DEBUG nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Build of instance ab839f84-b864-409e-883d-00dddb5db3db was re-scheduled: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1597.066104] env[69648]: DEBUG nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1597.066279] env[69648]: DEBUG nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1597.066438] env[69648]: DEBUG nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1597.066599] env[69648]: DEBUG nova.network.neutron [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1597.171546] env[69648]: DEBUG neutronclient.v2_0.client [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=69648) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1597.171738] env[69648]: ERROR nova.compute.manager [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1597.171738] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Traceback (most recent call last): [ 1597.171738] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1597.171738] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1597.171738] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1597.171738] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] result = getattr(controller, method)(*args, **kwargs) [ 1597.171738] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1597.171738] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._get(image_id) [ 1597.171738] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1597.171738] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1597.171738] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] resp, body = self.http_client.get(url, headers=header) [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: 
ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.request(url, 'GET', **kwargs) [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._handle_response(resp) [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise exc.from_response(resp, resp.content) [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] During handling of the above exception, another exception occurred: [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.172031] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Traceback (most recent call last): [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self.driver.spawn(context, instance, image_meta, [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._fetch_image_if_missing(context, vi) [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] image_fetch(context, vi, tmp_image_ds_loc) [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] 
images.fetch_image( [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] metadata = IMAGE_API.get(context, image_ref) [ 1597.172313] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return session.show(context, image_id, [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] _reraise_translated_image_exception(image_id) [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise new_exc.with_traceback(exc_trace) [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] result = getattr(controller, method)(*args, **kwargs) [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._get(image_id) [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1597.172614] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] resp, body = self.http_client.get(url, headers=header) [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.request(url, 'GET', **kwargs) [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self._handle_response(resp) [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise exc.from_response(resp, resp.content) [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] During handling of the above exception, another exception occurred: [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.172898] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Traceback (most recent call last): [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 2431, in _do_build_and_run_instance [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._build_and_run_instance(context, instance, image, [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 2723, in _build_and_run_instance [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise exception.RescheduledException( [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] nova.exception.RescheduledException: Build of instance ab839f84-b864-409e-883d-00dddb5db3db was re-scheduled: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. 
[ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] During handling of the above exception, another exception occurred: [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Traceback (most recent call last): [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1597.173198] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] exception_handler_v20(status_code, error_body) [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise client_exc(message=error_message, [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Neutron server returns request_ids: ['req-3d52f2ad-fe5e-40c0-8b9a-39e74e782870'] [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] During handling of the above exception, another exception occurred: [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Traceback (most recent call last): [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 3020, in _cleanup_allocated_networks [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._deallocate_network(context, instance, requested_networks) [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self.network_api.deallocate_for_instance( [ 1597.173529] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] data = neutron.list_ports(**search_opts) [ 
1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.list('ports', self.ports_path, retrieve_all, [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] for r in self._pagination(collection, path, **params): [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] res = self.get(path, params=params) [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.173858] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.retry_request("GET", action, body=body, [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.do_request(method, action, body=body, [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 
1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._handle_fault_response(status_code, replybody, resp) [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise exception.Unauthorized() [ 1597.174167] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] nova.exception.Unauthorized: Not authorized. [ 1597.174482] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.226435] env[69648]: INFO nova.scheduler.client.report [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Deleted allocations for instance ab839f84-b864-409e-883d-00dddb5db3db [ 1597.247777] env[69648]: DEBUG oslo_concurrency.lockutils [None req-de005f76-6853-43f5-b48a-7f118ec50261 tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "ab839f84-b864-409e-883d-00dddb5db3db" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 571.295s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.248912] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "ab839f84-b864-409e-883d-00dddb5db3db" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 376.015s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1597.249135] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Acquiring lock "ab839f84-b864-409e-883d-00dddb5db3db-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1597.249356] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "ab839f84-b864-409e-883d-00dddb5db3db-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1597.250048] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "ab839f84-b864-409e-883d-00dddb5db3db-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.251448] env[69648]: INFO nova.compute.manager [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 
tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Terminating instance [ 1597.255556] env[69648]: DEBUG nova.compute.manager [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1597.255765] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1597.256030] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d21b558d-784d-4522-a6da-340dd83ca5f0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.259410] env[69648]: DEBUG nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1597.269409] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-820f7d5f-4d84-463e-b8ad-357491091551 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.300726] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ab839f84-b864-409e-883d-00dddb5db3db could not be found. [ 1597.300930] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1597.301914] env[69648]: INFO nova.compute.manager [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1597.301914] env[69648]: DEBUG oslo.service.loopingcall [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1597.304588] env[69648]: DEBUG nova.compute.manager [-] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1597.304718] env[69648]: DEBUG nova.network.neutron [-] [instance: ab839f84-b864-409e-883d-00dddb5db3db] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1597.320942] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1597.330739] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-065bde4d-c6de-4209-954d-34988f6bb2d7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.339198] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b1a9e35-8ff1-4d11-a5c4-82df0136d0b6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.370738] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed898377-3a7f-4c83-a38b-c9218ba340ff {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.378390] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32868282-fe70-4607-91cb-c0bb24af0c11 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.391801] env[69648]: DEBUG nova.compute.provider_tree [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1597.400394] env[69648]: DEBUG nova.scheduler.client.report [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1597.417911] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.356s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.418415] env[69648]: DEBUG nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1597.420907] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.100s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1597.422214] env[69648]: INFO nova.compute.claims [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1597.424929] env[69648]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=69648) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1597.425177] env[69648]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1597.427416] env[69648]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-a9261f4f-3601-420c-94b1-1fedb11fecdb'] [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1597.427416] env[69648]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1597.427890] env[69648]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1597.427890] env[69648]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1597.428305] env[69648]: ERROR oslo.service.loopingcall [ 1597.428664] env[69648]: ERROR nova.compute.manager [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1597.474461] env[69648]: ERROR nova.compute.manager [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
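Both failure paths in this log, the rescheduled build earlier and the failed network deallocation during terminate here, pass through the same wrapper in nova/network/neutron.py: a neutronclient 401 raised under the caller's token is re-raised as nova.exception.Unauthorized (wrapper line 204 in the first traceback), while a 401 hit while the service is using its own credentials is re-raised as NeutronAdminCredentialConfigurationInvalid (wrapper line 212 here), which is why the message asks to verify the Neutron credentials in nova.conf. The following is a minimal, hypothetical sketch of that translation pattern, written under the assumption that the wrapper can tell which kind of token was in use; the class and attribute names are illustrative, not Nova's real implementation.

    # Minimal sketch of the 401-translation pattern shown in the wrapper
    # frames above. All names here are hypothetical stand-ins.
    import functools


    class ClientUnauthorized(Exception):
        """Stand-in for the networking client's raw 401 exception."""


    class Unauthorized(Exception):
        """Raised when the caller's own token was rejected."""


    class AdminCredentialInvalid(Exception):
        """Raised when the service's own credentials were rejected."""


    def translate_client_errors(func):
        @functools.wraps(func)
        def wrapper(client, *args, **kwargs):
            try:
                return func(client, *args, **kwargs)
            except ClientUnauthorized:
                if client.uses_admin_token:
                    # Misconfigured service credentials: operator action needed.
                    raise AdminCredentialInvalid(
                        "Networking client is experiencing an unauthorized "
                        "exception.")
                # The caller's token expired or was invalid.
                raise Unauthorized("Not authorized.")
        return wrapper


    class NetworkClient:
        def __init__(self, uses_admin_token):
            self.uses_admin_token = uses_admin_token

        @translate_client_errors
        def list_ports(self, **search_opts):
            # Hypothetical call that fails with a 401 in this scenario.
            raise ClientUnauthorized("401 Unauthorized")

In this sketch, NetworkClient(uses_admin_token=True).list_ports() raises AdminCredentialInvalid, mirroring the terminate path logged here, while the same call with uses_admin_token=False raises Unauthorized, mirroring the earlier reschedule path.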
[ 1597.474461] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Traceback (most recent call last): [ 1597.474461] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.474461] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.474461] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1597.474461] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] exception_handler_v20(status_code, error_body) [ 1597.474461] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1597.474461] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise client_exc(message=error_message, [ 1597.474461] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1597.474461] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Neutron server returns request_ids: ['req-a9261f4f-3601-420c-94b1-1fedb11fecdb'] [ 1597.474461] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] During handling of the above exception, another exception occurred: [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] Traceback (most recent call last): [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._delete_instance(context, instance, bdms) [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._shutdown_instance(context, instance, bdms) [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._try_deallocate_network(context, instance, requested_networks) [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] with excutils.save_and_reraise_exception(): [ 1597.474801] env[69648]: ERROR 
nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1597.474801] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self.force_reraise() [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise self.value [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] _deallocate_network_with_retries() [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return evt.wait() [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] result = hub.switch() [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.greenlet.switch() [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1597.475168] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] result = func(*self.args, **self.kw) [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] result = f(*args, **kwargs) [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._deallocate_network( [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self.network_api.deallocate_for_instance( [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: 
ab839f84-b864-409e-883d-00dddb5db3db] data = neutron.list_ports(**search_opts) [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.list('ports', self.ports_path, retrieve_all, [ 1597.475454] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] for r in self._pagination(collection, path, **params): [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] res = self.get(path, params=params) [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.retry_request("GET", action, body=body, [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1597.475755] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] return self.do_request(method, action, body=body, [ 1597.476061] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.476061] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] ret = obj(*args, **kwargs) [ 1597.476061] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1597.476061] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] self._handle_fault_response(status_code, replybody, resp) [ 1597.476061] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1597.476061] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1597.476061] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1597.476061] env[69648]: ERROR nova.compute.manager [instance: ab839f84-b864-409e-883d-00dddb5db3db] [ 1597.485301] env[69648]: DEBUG nova.compute.utils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1597.486460] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "114bdafc-21f6-4a77-bf19-a444cbd8806c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1597.487508] env[69648]: DEBUG nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1597.487508] env[69648]: DEBUG nova.network.neutron [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1597.495626] env[69648]: DEBUG nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Start building block device mappings for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1597.503793] env[69648]: DEBUG oslo_concurrency.lockutils [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Lock "ab839f84-b864-409e-883d-00dddb5db3db" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.255s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.560841] env[69648]: DEBUG nova.policy [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e0d687f3eb0b41dbb79bcbc837ea5aef', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '36dc70105e714e63853cee7494aab144', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1597.575661] env[69648]: INFO nova.compute.manager [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] [instance: ab839f84-b864-409e-883d-00dddb5db3db] Successfully reverted task state from None on failure for instance. [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server [None req-d156cef4-67ce-458f-ad13-4d4f06a2a4db tempest-DeleteServersAdminTestJSON-897165489 tempest-DeleteServersAdminTestJSON-897165489-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-a9261f4f-3601-420c-94b1-1fedb11fecdb'] [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1597.581793] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception():
[ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1597.582419] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server raise self.value
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server raise self.value
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms)
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs)
[ 1597.583064] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server raise self.value
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms)
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms)
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks)
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server raise self.value
[ 1597.583551] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries()
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server return evt.wait()
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server result = hub.switch()
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server return self.greenlet.switch()
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw)
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs)
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server self._deallocate_network(
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance(
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts)
[ 1597.584027] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all,
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params):
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params)
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body,
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body,
[ 1597.584459] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1597.584921] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1597.584921] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1597.584921] env[69648]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp)
[ 1597.584921] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1597.584921] env[69648]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1597.584921] env[69648]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1597.584921] env[69648]: ERROR oslo_messaging.rpc.server
[ 1597.586989] env[69648]: DEBUG nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Start spawning the instance on the hypervisor.
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1597.615336] env[69648]: DEBUG nova.virt.hardware [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1597.615588] env[69648]: DEBUG nova.virt.hardware [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1597.615784] env[69648]: DEBUG nova.virt.hardware [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1597.615940] env[69648]: DEBUG nova.virt.hardware [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1597.616101] env[69648]: DEBUG nova.virt.hardware [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1597.616252] env[69648]: DEBUG nova.virt.hardware [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1597.616473] env[69648]: DEBUG nova.virt.hardware [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1597.616694] env[69648]: DEBUG nova.virt.hardware [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1597.616875] env[69648]: DEBUG 
nova.virt.hardware [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1597.617052] env[69648]: DEBUG nova.virt.hardware [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1597.617232] env[69648]: DEBUG nova.virt.hardware [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1597.618152] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52ee06ba-a45e-49fe-bdbd-d0ce685f3e1f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.626517] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-982f5e1d-ef8b-45ce-b58c-525c359dbedf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.679210] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21a661ab-dc5f-430b-9814-ac6a9dfbb0d3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.686213] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f881a93-0650-4def-802a-ec7ca5bf04ce {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.718350] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9c363d4-439b-4b91-914e-cf87e695be83 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.725659] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f8aaf29-fdba-4d6a-b73d-254112b045ef {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.738993] env[69648]: DEBUG nova.compute.provider_tree [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1597.747891] env[69648]: DEBUG nova.scheduler.client.report [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1597.762487] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.341s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.763013] env[69648]: DEBUG nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1597.807497] env[69648]: DEBUG nova.compute.utils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1597.808821] env[69648]: DEBUG nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1597.809020] env[69648]: DEBUG nova.network.neutron [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1597.818423] env[69648]: DEBUG nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1597.882764] env[69648]: DEBUG nova.policy [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dcce409aea2f4744bda144de55e46052', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd1ecc20de6ab4597a08d93cca45ed56c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1597.889028] env[69648]: DEBUG nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1597.909697] env[69648]: DEBUG nova.virt.hardware [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1597.909945] env[69648]: DEBUG nova.virt.hardware [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1597.910116] env[69648]: DEBUG nova.virt.hardware [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1597.910302] env[69648]: DEBUG nova.virt.hardware [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1597.910452] env[69648]: DEBUG nova.virt.hardware [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1597.910601] env[69648]: DEBUG nova.virt.hardware [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1597.910807] env[69648]: DEBUG nova.virt.hardware [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1597.910969] env[69648]: DEBUG nova.virt.hardware [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1597.911249] env[69648]: DEBUG nova.virt.hardware [None 
req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1597.911460] env[69648]: DEBUG nova.virt.hardware [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1597.911680] env[69648]: DEBUG nova.virt.hardware [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1597.912561] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9c28b09-4d18-47a2-9cb1-3d4c2720aa54 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.921023] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38b23d3c-320a-4c0f-b144-b4850d4b02bc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.966879] env[69648]: DEBUG nova.network.neutron [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Successfully created port: c92c43ff-32d1-4a09-9980-652fcc0f7f14 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1598.255279] env[69648]: DEBUG nova.network.neutron [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Successfully created port: 4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1598.681482] env[69648]: DEBUG nova.network.neutron [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Successfully updated port: c92c43ff-32d1-4a09-9980-652fcc0f7f14 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1598.700907] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquiring lock "refresh_cache-0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1598.701316] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquired lock "refresh_cache-0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1598.701316] env[69648]: DEBUG nova.network.neutron [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 
tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1598.749640] env[69648]: DEBUG nova.network.neutron [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1598.826818] env[69648]: DEBUG nova.compute.manager [req-8f9caa0b-70eb-4706-bdb8-ecda9190d0e7 req-2926d7b5-64dd-4728-abf9-028639636f7b service nova] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Received event network-vif-plugged-c92c43ff-32d1-4a09-9980-652fcc0f7f14 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1598.827097] env[69648]: DEBUG oslo_concurrency.lockutils [req-8f9caa0b-70eb-4706-bdb8-ecda9190d0e7 req-2926d7b5-64dd-4728-abf9-028639636f7b service nova] Acquiring lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1598.827263] env[69648]: DEBUG oslo_concurrency.lockutils [req-8f9caa0b-70eb-4706-bdb8-ecda9190d0e7 req-2926d7b5-64dd-4728-abf9-028639636f7b service nova] Lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1598.828018] env[69648]: DEBUG oslo_concurrency.lockutils [req-8f9caa0b-70eb-4706-bdb8-ecda9190d0e7 req-2926d7b5-64dd-4728-abf9-028639636f7b service nova] Lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1598.828018] env[69648]: DEBUG nova.compute.manager [req-8f9caa0b-70eb-4706-bdb8-ecda9190d0e7 req-2926d7b5-64dd-4728-abf9-028639636f7b service nova] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] No waiting events found dispatching network-vif-plugged-c92c43ff-32d1-4a09-9980-652fcc0f7f14 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1598.828018] env[69648]: WARNING nova.compute.manager [req-8f9caa0b-70eb-4706-bdb8-ecda9190d0e7 req-2926d7b5-64dd-4728-abf9-028639636f7b service nova] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Received unexpected event network-vif-plugged-c92c43ff-32d1-4a09-9980-652fcc0f7f14 for instance with vm_state building and task_state spawning. 
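Editor's note: the ERROR traceback at the top of this section shows the delete path (terminate_instance -> do_terminate_instance -> _delete_instance -> _shutdown_instance -> _try_deallocate_network -> neutron.list_ports) failing with NeutronAdminCredentialConfigurationInvalid. The repeated __exit__ / force_reraise / raise self.value frames in between come from oslo.utils' save_and_reraise_exception context manager, which each layer uses to run cleanup on the error path and then re-raise the original exception. A minimal sketch of that pattern, illustrative only and not Nova's actual code (the helper names below are made up):

    from oslo_utils import excutils

    def deallocate_network(instance):
        # Hypothetical stand-in for the neutron call that failed in the traceback above.
        raise RuntimeError("Unauthorized")

    def delete_instance(instance):
        try:
            deallocate_network(instance)
        except Exception:
            with excutils.save_and_reraise_exception():
                # Cleanup runs here; when the block exits, the saved exception is
                # re-raised, which is what produces the __exit__ / force_reraise /
                # raise self.value frames seen in the log.
                print(f"delete of {instance} failed, cleaning up")

    try:
        delete_instance("0b76d29b-a8b2-4b65-88b7-a6e5df9a602b")
    except RuntimeError as exc:
        print(f"original exception preserved: {exc}")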
[ 1598.894846] env[69648]: DEBUG nova.compute.manager [req-54b9eaff-1941-44e4-9536-d3b3f9326826 req-e1e1e6ca-463d-470c-9206-4d84eebd551e service nova] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Received event network-vif-plugged-4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1598.895099] env[69648]: DEBUG oslo_concurrency.lockutils [req-54b9eaff-1941-44e4-9536-d3b3f9326826 req-e1e1e6ca-463d-470c-9206-4d84eebd551e service nova] Acquiring lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1598.895314] env[69648]: DEBUG oslo_concurrency.lockutils [req-54b9eaff-1941-44e4-9536-d3b3f9326826 req-e1e1e6ca-463d-470c-9206-4d84eebd551e service nova] Lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1598.895481] env[69648]: DEBUG oslo_concurrency.lockutils [req-54b9eaff-1941-44e4-9536-d3b3f9326826 req-e1e1e6ca-463d-470c-9206-4d84eebd551e service nova] Lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1598.895646] env[69648]: DEBUG nova.compute.manager [req-54b9eaff-1941-44e4-9536-d3b3f9326826 req-e1e1e6ca-463d-470c-9206-4d84eebd551e service nova] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] No waiting events found dispatching network-vif-plugged-4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1598.895819] env[69648]: WARNING nova.compute.manager [req-54b9eaff-1941-44e4-9536-d3b3f9326826 req-e1e1e6ca-463d-470c-9206-4d84eebd551e service nova] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Received unexpected event network-vif-plugged-4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2 for instance with vm_state building and task_state spawning. 
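Editor's note: the two WARNING lines above ("Received unexpected event network-vif-plugged-... for instance with vm_state building and task_state spawning") appear when Neutron reports the port as plugged before the compute manager has registered a waiter for that event: the event is popped under the per-instance "<uuid>-events" lock, no waiter is found, and the event is recorded as unexpected. A rough plain-Python model of that dispatch logic, illustrative only and not Nova's implementation:

    import threading

    class InstanceEvents:
        """Toy model of a per-instance registry of waiters for external events."""

        def __init__(self):
            self._lock = threading.Lock()   # plays the role of the "<uuid>-events" lock above
            self._waiters = {}              # (instance_uuid, event_name) -> threading.Event

        def prepare_for_event(self, instance_uuid, event_name):
            waiter = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = waiter
            return waiter

        def dispatch(self, instance_uuid, event_name):
            with self._lock:
                waiter = self._waiters.pop((instance_uuid, event_name), None)
            if waiter is None:
                # No one registered for this event yet -> the "No waiting events found
                # dispatching ..." / "Received unexpected event ..." lines in the log.
                print(f"unexpected event {event_name} for {instance_uuid}")
            else:
                waiter.set()

    events = InstanceEvents()
    events.dispatch("bde8a72e-0ed5-4794-badf-0bc54c4c408b",
                    "network-vif-plugged-4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2")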
[ 1598.950085] env[69648]: DEBUG nova.network.neutron [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Successfully updated port: 4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1598.961983] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "refresh_cache-bde8a72e-0ed5-4794-badf-0bc54c4c408b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1598.962167] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquired lock "refresh_cache-bde8a72e-0ed5-4794-badf-0bc54c4c408b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1598.962326] env[69648]: DEBUG nova.network.neutron [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1599.030185] env[69648]: DEBUG nova.network.neutron [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1599.175984] env[69648]: DEBUG nova.network.neutron [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Updating instance_info_cache with network_info: [{"id": "c92c43ff-32d1-4a09-9980-652fcc0f7f14", "address": "fa:16:3e:16:bd:14", "network": {"id": "234778e1-d849-49e4-9e8c-60f25f3877a1", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-677813074-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "36dc70105e714e63853cee7494aab144", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3db2ab9e-1244-4377-b05f-ab76003f2428", "external-id": "nsx-vlan-transportzone-199", "segmentation_id": 199, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc92c43ff-32", "ovs_interfaceid": "c92c43ff-32d1-4a09-9980-652fcc0f7f14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1599.189447] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Releasing lock "refresh_cache-0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1599.189750] env[69648]: DEBUG nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Instance network_info: |[{"id": "c92c43ff-32d1-4a09-9980-652fcc0f7f14", "address": "fa:16:3e:16:bd:14", "network": {"id": "234778e1-d849-49e4-9e8c-60f25f3877a1", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-677813074-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "36dc70105e714e63853cee7494aab144", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3db2ab9e-1244-4377-b05f-ab76003f2428", "external-id": "nsx-vlan-transportzone-199", "segmentation_id": 199, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc92c43ff-32", "ovs_interfaceid": "c92c43ff-32d1-4a09-9980-652fcc0f7f14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 
1599.190228] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:16:bd:14', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3db2ab9e-1244-4377-b05f-ab76003f2428', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c92c43ff-32d1-4a09-9980-652fcc0f7f14', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1599.197985] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Creating folder: Project (36dc70105e714e63853cee7494aab144). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1599.198509] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2f229006-59fa-4b7f-9f43-89c84d8eace7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.211012] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Created folder: Project (36dc70105e714e63853cee7494aab144) in parent group-v692308. [ 1599.211224] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Creating folder: Instances. Parent ref: group-v692394. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1599.211435] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a71e862d-dc38-48c7-ab5a-d36608b266c4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.220013] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Created folder: Instances in parent group-v692394. [ 1599.220323] env[69648]: DEBUG oslo.service.loopingcall [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1599.220536] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1599.220743] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-236ab451-44a1-431f-9625-41831a263f09 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.242305] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1599.242305] env[69648]: value = "task-3466614" [ 1599.242305] env[69648]: _type = "Task" [ 1599.242305] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1599.249594] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466614, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1599.263644] env[69648]: DEBUG nova.network.neutron [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Updating instance_info_cache with network_info: [{"id": "4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2", "address": "fa:16:3e:ad:0c:37", "network": {"id": "fc9dfa48-a79d-4532-a3ed-e1ad779b2906", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-685310351-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d1ecc20de6ab4597a08d93cca45ed56c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee018eb-75be-4037-a80a-07034d4eae35", "external-id": "nsx-vlan-transportzone-8", "segmentation_id": 8, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4e69f2c0-f7", "ovs_interfaceid": "4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1599.274463] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Releasing lock "refresh_cache-bde8a72e-0ed5-4794-badf-0bc54c4c408b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1599.274786] env[69648]: DEBUG nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Instance network_info: |[{"id": "4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2", "address": "fa:16:3e:ad:0c:37", "network": {"id": "fc9dfa48-a79d-4532-a3ed-e1ad779b2906", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-685310351-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d1ecc20de6ab4597a08d93cca45ed56c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee018eb-75be-4037-a80a-07034d4eae35", "external-id": "nsx-vlan-transportzone-8", "segmentation_id": 8, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4e69f2c0-f7", "ovs_interfaceid": "4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1599.275192] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ad:0c:37', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2ee018eb-75be-4037-a80a-07034d4eae35', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1599.282514] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Creating folder: Project (d1ecc20de6ab4597a08d93cca45ed56c). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1599.282967] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d6e845d9-8c28-4fc5-999a-d9d4cfff66fc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.292632] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Created folder: Project (d1ecc20de6ab4597a08d93cca45ed56c) in parent group-v692308. [ 1599.292841] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Creating folder: Instances. Parent ref: group-v692397. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1599.293119] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e21711a2-104b-4c38-b3f0-fc6844bef6c9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.302610] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Created folder: Instances in parent group-v692397. [ 1599.302839] env[69648]: DEBUG oslo.service.loopingcall [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1599.303041] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1599.303267] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-44ed00df-395f-4f8c-86fa-872204096998 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.322077] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1599.322077] env[69648]: value = "task-3466617" [ 1599.322077] env[69648]: _type = "Task" [ 1599.322077] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1599.329601] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466617, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1599.752452] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466614, 'name': CreateVM_Task, 'duration_secs': 0.272761} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1599.752756] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1599.753338] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1599.753503] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1599.753809] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1599.754064] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-24911d7e-7a69-4587-941d-8dfa3f247685 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.758174] env[69648]: DEBUG oslo_vmware.api [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Waiting for the task: (returnval){ [ 1599.758174] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52b9fc3b-c068-4aa6-5cbe-13aeca4bf504" [ 1599.758174] env[69648]: _type = "Task" [ 1599.758174] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1599.765369] env[69648]: DEBUG oslo_vmware.api [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52b9fc3b-c068-4aa6-5cbe-13aeca4bf504, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1599.831662] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466617, 'name': CreateVM_Task, 'duration_secs': 0.278899} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1599.831846] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1599.832502] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1600.269337] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1600.269603] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1600.269818] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1600.270042] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1600.270357] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1600.270608] env[69648]: DEBUG oslo_vmware.service [-] Invoking 
HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c30ebdc1-7c99-492f-8baa-bd74fdbfc94e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1600.275376] env[69648]: DEBUG oslo_vmware.api [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for the task: (returnval){ [ 1600.275376] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]522cdfdf-9598-c1ec-9501-c6028b55825d" [ 1600.275376] env[69648]: _type = "Task" [ 1600.275376] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1600.282581] env[69648]: DEBUG oslo_vmware.api [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]522cdfdf-9598-c1ec-9501-c6028b55825d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1600.786566] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1600.786979] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1600.787083] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1600.850749] env[69648]: DEBUG nova.compute.manager [req-ba8e4dde-6e1e-4df4-843b-d4fe925e1161 req-354be256-6357-49be-8578-0a6a745a7fce service nova] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Received event network-changed-c92c43ff-32d1-4a09-9980-652fcc0f7f14 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1600.850957] env[69648]: DEBUG nova.compute.manager [req-ba8e4dde-6e1e-4df4-843b-d4fe925e1161 req-354be256-6357-49be-8578-0a6a745a7fce service nova] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Refreshing instance network info cache due to event network-changed-c92c43ff-32d1-4a09-9980-652fcc0f7f14. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1600.851192] env[69648]: DEBUG oslo_concurrency.lockutils [req-ba8e4dde-6e1e-4df4-843b-d4fe925e1161 req-354be256-6357-49be-8578-0a6a745a7fce service nova] Acquiring lock "refresh_cache-0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1600.851343] env[69648]: DEBUG oslo_concurrency.lockutils [req-ba8e4dde-6e1e-4df4-843b-d4fe925e1161 req-354be256-6357-49be-8578-0a6a745a7fce service nova] Acquired lock "refresh_cache-0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1600.851505] env[69648]: DEBUG nova.network.neutron [req-ba8e4dde-6e1e-4df4-843b-d4fe925e1161 req-354be256-6357-49be-8578-0a6a745a7fce service nova] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Refreshing network info cache for port c92c43ff-32d1-4a09-9980-652fcc0f7f14 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1600.946423] env[69648]: DEBUG nova.compute.manager [req-cfe590f2-fe2b-40f2-8a81-677b9aefa99e req-ceea2d02-370f-4aa1-9226-85dc99c29f1c service nova] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Received event network-changed-4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1600.946671] env[69648]: DEBUG nova.compute.manager [req-cfe590f2-fe2b-40f2-8a81-677b9aefa99e req-ceea2d02-370f-4aa1-9226-85dc99c29f1c service nova] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Refreshing instance network info cache due to event network-changed-4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1600.947042] env[69648]: DEBUG oslo_concurrency.lockutils [req-cfe590f2-fe2b-40f2-8a81-677b9aefa99e req-ceea2d02-370f-4aa1-9226-85dc99c29f1c service nova] Acquiring lock "refresh_cache-bde8a72e-0ed5-4794-badf-0bc54c4c408b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1600.947216] env[69648]: DEBUG oslo_concurrency.lockutils [req-cfe590f2-fe2b-40f2-8a81-677b9aefa99e req-ceea2d02-370f-4aa1-9226-85dc99c29f1c service nova] Acquired lock "refresh_cache-bde8a72e-0ed5-4794-badf-0bc54c4c408b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1600.947388] env[69648]: DEBUG nova.network.neutron [req-cfe590f2-fe2b-40f2-8a81-677b9aefa99e req-ceea2d02-370f-4aa1-9226-85dc99c29f1c service nova] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Refreshing network info cache for port 4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1601.200279] env[69648]: DEBUG nova.network.neutron [req-ba8e4dde-6e1e-4df4-843b-d4fe925e1161 req-354be256-6357-49be-8578-0a6a745a7fce service nova] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Updated VIF entry in instance network info cache for port c92c43ff-32d1-4a09-9980-652fcc0f7f14. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1601.200646] env[69648]: DEBUG nova.network.neutron [req-ba8e4dde-6e1e-4df4-843b-d4fe925e1161 req-354be256-6357-49be-8578-0a6a745a7fce service nova] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Updating instance_info_cache with network_info: [{"id": "c92c43ff-32d1-4a09-9980-652fcc0f7f14", "address": "fa:16:3e:16:bd:14", "network": {"id": "234778e1-d849-49e4-9e8c-60f25f3877a1", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-677813074-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "36dc70105e714e63853cee7494aab144", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3db2ab9e-1244-4377-b05f-ab76003f2428", "external-id": "nsx-vlan-transportzone-199", "segmentation_id": 199, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc92c43ff-32", "ovs_interfaceid": "c92c43ff-32d1-4a09-9980-652fcc0f7f14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1601.212450] env[69648]: DEBUG oslo_concurrency.lockutils [req-ba8e4dde-6e1e-4df4-843b-d4fe925e1161 req-354be256-6357-49be-8578-0a6a745a7fce service nova] Releasing lock "refresh_cache-0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1601.225250] env[69648]: DEBUG nova.network.neutron [req-cfe590f2-fe2b-40f2-8a81-677b9aefa99e req-ceea2d02-370f-4aa1-9226-85dc99c29f1c service nova] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Updated VIF entry in instance network info cache for port 4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1601.225697] env[69648]: DEBUG nova.network.neutron [req-cfe590f2-fe2b-40f2-8a81-677b9aefa99e req-ceea2d02-370f-4aa1-9226-85dc99c29f1c service nova] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Updating instance_info_cache with network_info: [{"id": "4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2", "address": "fa:16:3e:ad:0c:37", "network": {"id": "fc9dfa48-a79d-4532-a3ed-e1ad779b2906", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-685310351-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d1ecc20de6ab4597a08d93cca45ed56c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee018eb-75be-4037-a80a-07034d4eae35", "external-id": "nsx-vlan-transportzone-8", "segmentation_id": 8, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4e69f2c0-f7", "ovs_interfaceid": "4e69f2c0-f74f-4af0-b4bb-e14b4249b9c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1601.255987] env[69648]: DEBUG oslo_concurrency.lockutils [req-cfe590f2-fe2b-40f2-8a81-677b9aefa99e req-ceea2d02-370f-4aa1-9226-85dc99c29f1c service nova] Releasing lock "refresh_cache-bde8a72e-0ed5-4794-badf-0bc54c4c408b" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1604.162579] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquiring lock "0a8de1d1-a783-4a32-9ee0-abb023943eeb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1604.162912] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Lock "0a8de1d1-a783-4a32-9ee0-abb023943eeb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1614.065088] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1615.060330] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1616.065437] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 
None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1618.065548] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1618.792954] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquiring lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1620.064919] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1620.164176] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1621.060372] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1623.066052] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1623.066052] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1623.066052] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1623.066447] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1623.077341] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1623.077667] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1623.077718] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1623.077883] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1623.079013] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd3cf8cd-d68b-49ad-b04b-4dcfbaf6c919 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.087947] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5811cd81-5307-471f-9682-e643fca26a75 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.101996] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-800c5093-58d6-45c2-8915-5c4a091cfb58 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.108350] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-176621e2-f21a-4d7a-92b3-5f867c48825d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.137052] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180940MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1623.137234] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1623.137437] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1623.212593] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1623.212777] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a924bdee-1e16-4d78-ac6b-9574677de55f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1623.212903] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1623.213035] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 58804be5-ee46-4b25-be84-890d5cd1607f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1623.213311] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c97308be-406b-4fd0-b502-69e8c800773f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1623.213311] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1623.213457] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1623.213508] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 114bdafc-21f6-4a77-bf19-a444cbd8806c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1623.213623] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1623.213706] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1623.225779] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 97ce6d4b-ad90-47c7-885a-1f6632c8b97d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1623.235188] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 723972b1-3f91-4c59-b265-3975644dadb2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1623.245562] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 18745ec2-477d-427d-b2dd-997f73d9fd53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1623.256431] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1623.266785] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1623.266931] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1623.267042] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1623.434451] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6484ef04-ad2e-4a1c-8d69-a1f25520f5e9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.444434] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd7a873b-ff8b-49d9-8c0e-9d36abf14f4d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.472994] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88a6b40f-d115-4457-9167-12c3b433cf03 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.479948] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f60dab2-e22c-4670-abe3-bd06d05340e3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.492727] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1623.501130] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1623.514589] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1623.514794] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.377s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.514538] env[69648]: DEBUG oslo_service.periodic_task [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1626.514842] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1626.514842] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1626.536251] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1626.536417] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1626.536537] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1626.536668] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1626.536796] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1626.536980] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1626.537139] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1626.537266] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1626.537387] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1626.537727] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1626.537727] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1637.009613] env[69648]: DEBUG oslo_concurrency.lockutils [None req-19214503-4e9f-4079-8707-b20bcb05e730 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquiring lock "3edf3b50-a4bf-4e75-927a-db78c433dbc4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1637.009954] env[69648]: DEBUG oslo_concurrency.lockutils [None req-19214503-4e9f-4079-8707-b20bcb05e730 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "3edf3b50-a4bf-4e75-927a-db78c433dbc4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1644.646841] env[69648]: WARNING oslo_vmware.rw_handles [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1644.646841] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1644.646841] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1644.646841] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1644.646841] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1644.646841] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1644.646841] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1644.646841] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1644.646841] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1644.646841] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1644.646841] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1644.646841] env[69648]: ERROR oslo_vmware.rw_handles [ 1644.647524] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/a0333ee3-3974-46ff-8989-3be6b994e05d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store 
datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1644.649482] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1644.649730] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Copying Virtual Disk [datastore1] vmware_temp/a0333ee3-3974-46ff-8989-3be6b994e05d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/a0333ee3-3974-46ff-8989-3be6b994e05d/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1644.650026] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-89236597-79b3-4080-85da-8077b9f51683 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.659400] env[69648]: DEBUG oslo_vmware.api [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for the task: (returnval){ [ 1644.659400] env[69648]: value = "task-3466618" [ 1644.659400] env[69648]: _type = "Task" [ 1644.659400] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1644.666882] env[69648]: DEBUG oslo_vmware.api [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Task: {'id': task-3466618, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1645.170506] env[69648]: DEBUG oslo_vmware.exceptions [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1645.170787] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1645.171369] env[69648]: ERROR nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1645.171369] env[69648]: Faults: ['InvalidArgument'] [ 1645.171369] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Traceback (most recent call last): [ 1645.171369] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1645.171369] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] yield resources [ 1645.171369] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1645.171369] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] self.driver.spawn(context, instance, image_meta, [ 1645.171369] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1645.171369] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1645.171369] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1645.171369] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] self._fetch_image_if_missing(context, vi) [ 1645.171369] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] image_cache(vi, tmp_image_ds_loc) [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] vm_util.copy_virtual_disk( [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] session._wait_for_task(vmdk_copy_task) [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] return self.wait_for_task(task_ref) [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] return evt.wait() [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] result = hub.switch() [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1645.171757] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] return self.greenlet.switch() [ 1645.172219] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1645.172219] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] self.f(*self.args, **self.kw) [ 1645.172219] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1645.172219] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] raise exceptions.translate_fault(task_info.error) [ 1645.172219] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1645.172219] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Faults: ['InvalidArgument'] [ 1645.172219] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] [ 1645.172219] env[69648]: INFO nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Terminating instance [ 1645.173948] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1645.173948] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1645.173948] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-f6c262c4-8797-4bb2-8be8-3b8dafa9b218 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.176081] env[69648]: DEBUG nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1645.176281] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1645.177028] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0c4518e-483e-41b8-b870-38a910a6b734 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.184887] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1645.185884] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e8fe3417-1f54-4ec2-ac5c-5f3f935871ed {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.187358] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1645.187525] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1645.188196] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-14041717-f597-40ca-8431-989cc1a7731d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.192841] env[69648]: DEBUG oslo_vmware.api [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for the task: (returnval){ [ 1645.192841] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]524da2e8-b68b-bf4a-f84e-d28d37d2d514" [ 1645.192841] env[69648]: _type = "Task" [ 1645.192841] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1645.200017] env[69648]: DEBUG oslo_vmware.api [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]524da2e8-b68b-bf4a-f84e-d28d37d2d514, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1645.258022] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1645.258022] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1645.258022] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Deleting the datastore file [datastore1] 147e1f39-c2ae-410e-9b62-cd56b5978e1b {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1645.258022] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ca807683-51e1-4e15-afc2-297091443589 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.263406] env[69648]: DEBUG oslo_vmware.api [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for the task: (returnval){ [ 1645.263406] env[69648]: value = "task-3466620" [ 1645.263406] env[69648]: _type = "Task" [ 1645.263406] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1645.272562] env[69648]: DEBUG oslo_vmware.api [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Task: {'id': task-3466620, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1645.703625] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1645.703936] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Creating directory with path [datastore1] vmware_temp/e0cf3896-a435-48a5-b398-d84f5f49161d/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1645.703984] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-760ce179-f827-4ba4-ab8a-8471c3f71f8c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.714944] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Created directory with path [datastore1] vmware_temp/e0cf3896-a435-48a5-b398-d84f5f49161d/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1645.715149] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Fetch image to [datastore1] vmware_temp/e0cf3896-a435-48a5-b398-d84f5f49161d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1645.715325] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/e0cf3896-a435-48a5-b398-d84f5f49161d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1645.716096] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e969e77c-2915-4ca8-932a-28cf091a1c34 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.722358] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d09a08a4-5565-458f-9986-cffa0672602e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.731221] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-654c1158-a482-4910-83fd-7de710409230 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.762462] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1790731f-1a48-4840-972a-e3ce02647c5b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.772723] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b7a60233-9093-4201-a566-005638950e0b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.774364] env[69648]: DEBUG oslo_vmware.api [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Task: {'id': task-3466620, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075022} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1645.774588] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1645.774769] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1645.774944] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1645.775122] env[69648]: INFO nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1645.777135] env[69648]: DEBUG nova.compute.claims [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1645.777302] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1645.777517] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1645.798493] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1645.848592] env[69648]: DEBUG oslo_vmware.rw_handles [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e0cf3896-a435-48a5-b398-d84f5f49161d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1645.909731] env[69648]: DEBUG oslo_vmware.rw_handles [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1645.909996] env[69648]: DEBUG oslo_vmware.rw_handles [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e0cf3896-a435-48a5-b398-d84f5f49161d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1646.052496] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca40f525-c43c-4d01-8890-f5f556a21cb3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.060317] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6676de0-bc4c-470e-89a3-95169c75ed18 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.089666] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cc16895-bfb7-4f5c-bfef-d0d86297d45e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.096903] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7882ba76-d9fb-4f6c-98c0-6d61de0e40f4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.109734] env[69648]: DEBUG nova.compute.provider_tree [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1646.118585] env[69648]: DEBUG nova.scheduler.client.report [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1646.134554] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.357s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1646.135137] env[69648]: ERROR nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1646.135137] env[69648]: Faults: ['InvalidArgument'] [ 1646.135137] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Traceback (most recent call last): [ 1646.135137] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1646.135137] 
env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] self.driver.spawn(context, instance, image_meta, [ 1646.135137] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1646.135137] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1646.135137] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1646.135137] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] self._fetch_image_if_missing(context, vi) [ 1646.135137] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1646.135137] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] image_cache(vi, tmp_image_ds_loc) [ 1646.135137] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] vm_util.copy_virtual_disk( [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] session._wait_for_task(vmdk_copy_task) [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] return self.wait_for_task(task_ref) [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] return evt.wait() [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] result = hub.switch() [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] return self.greenlet.switch() [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1646.135589] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] self.f(*self.args, **self.kw) [ 1646.135905] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1646.135905] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] raise exceptions.translate_fault(task_info.error) [ 1646.135905] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1646.135905] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Faults: ['InvalidArgument'] [ 1646.135905] env[69648]: ERROR nova.compute.manager [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] [ 1646.135905] env[69648]: DEBUG nova.compute.utils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1646.137273] env[69648]: DEBUG nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Build of instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b was re-scheduled: A specified parameter was not correct: fileType [ 1646.137273] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1646.137645] env[69648]: DEBUG nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1646.137822] env[69648]: DEBUG nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1646.137980] env[69648]: DEBUG nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1646.138169] env[69648]: DEBUG nova.network.neutron [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1646.403670] env[69648]: DEBUG nova.network.neutron [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1646.414657] env[69648]: INFO nova.compute.manager [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Took 0.28 seconds to deallocate network for instance. [ 1646.505920] env[69648]: INFO nova.scheduler.client.report [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Deleted allocations for instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b [ 1646.531032] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c4182fa4-d708-4549-8e63-1ab3f66ed797 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 518.256s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1646.532066] env[69648]: DEBUG oslo_concurrency.lockutils [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 126.284s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1646.532295] env[69648]: DEBUG oslo_concurrency.lockutils [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1646.532535] env[69648]: DEBUG oslo_concurrency.lockutils [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1646.532709] env[69648]: DEBUG oslo_concurrency.lockutils [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1646.535192] env[69648]: INFO nova.compute.manager [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Terminating instance [ 1646.536711] env[69648]: DEBUG nova.compute.manager [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1646.536911] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1646.537942] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-01a54b4a-c223-4c7e-87b7-8f52510e6266 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.545809] env[69648]: DEBUG nova.compute.manager [None req-9385912d-383d-40d0-ac55-1e66f7f4de1b tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: 97ce6d4b-ad90-47c7-885a-1f6632c8b97d] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1646.551331] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21dd1ab9-6fa6-44b5-860f-4d68c58ed525 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.570636] env[69648]: DEBUG nova.compute.manager [None req-9385912d-383d-40d0-ac55-1e66f7f4de1b tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: 97ce6d4b-ad90-47c7-885a-1f6632c8b97d] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1646.581705] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 147e1f39-c2ae-410e-9b62-cd56b5978e1b could not be found. 
[ 1646.581924] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1646.582311] env[69648]: INFO nova.compute.manager [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1646.582382] env[69648]: DEBUG oslo.service.loopingcall [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1646.582675] env[69648]: DEBUG nova.compute.manager [-] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1646.582742] env[69648]: DEBUG nova.network.neutron [-] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1646.601718] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9385912d-383d-40d0-ac55-1e66f7f4de1b tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "97ce6d4b-ad90-47c7-885a-1f6632c8b97d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.754s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1646.611878] env[69648]: DEBUG nova.network.neutron [-] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1646.613358] env[69648]: DEBUG nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1646.618522] env[69648]: INFO nova.compute.manager [-] [instance: 147e1f39-c2ae-410e-9b62-cd56b5978e1b] Took 0.04 seconds to deallocate network for instance. 
[ 1646.668360] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1646.668617] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1646.670116] env[69648]: INFO nova.compute.claims [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1646.708998] env[69648]: DEBUG oslo_concurrency.lockutils [None req-353d5504-84b9-412e-86ae-3bbbe64da105 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "147e1f39-c2ae-410e-9b62-cd56b5978e1b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.177s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1646.877778] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-579a6bc0-1418-41e7-955d-ad82efd8ffb9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.886205] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e040f8cc-572c-44e5-8090-fc59620d5115 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.916631] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b1541f0-ac31-46e3-80c9-57446ea486ba {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.923327] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ff647e0-2934-49fe-9dff-90a9efb8a606 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.936363] env[69648]: DEBUG nova.compute.provider_tree [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1646.944802] env[69648]: DEBUG nova.scheduler.client.report [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1646.958363] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.290s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1646.958823] env[69648]: DEBUG nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1646.993838] env[69648]: DEBUG nova.compute.utils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1646.995604] env[69648]: DEBUG nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1646.995790] env[69648]: DEBUG nova.network.neutron [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1647.005871] env[69648]: DEBUG nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1647.078395] env[69648]: DEBUG nova.policy [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd02320b12288496eae0a735447321a7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '896367398859465488fc12205d122a4e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1647.081638] env[69648]: DEBUG nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1647.109021] env[69648]: DEBUG nova.virt.hardware [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1647.109021] env[69648]: DEBUG nova.virt.hardware [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1647.109021] env[69648]: DEBUG nova.virt.hardware [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1647.109223] env[69648]: DEBUG nova.virt.hardware [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1647.109223] env[69648]: DEBUG nova.virt.hardware [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1647.109223] env[69648]: DEBUG nova.virt.hardware [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1647.109223] env[69648]: DEBUG nova.virt.hardware [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1647.109223] env[69648]: DEBUG nova.virt.hardware [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1647.109400] env[69648]: DEBUG nova.virt.hardware [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 
tempest-ImagesTestJSON-1163555157-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1647.109400] env[69648]: DEBUG nova.virt.hardware [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1647.109400] env[69648]: DEBUG nova.virt.hardware [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1647.109498] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df85d1f6-b124-44d8-82f5-4f4e63d704e7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.117417] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e7123b7-97ef-45dc-a361-40031d682f1d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.410981] env[69648]: DEBUG nova.network.neutron [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Successfully created port: b5279022-7d32-472f-bb75-c93e5fe99351 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1648.154903] env[69648]: DEBUG nova.network.neutron [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Successfully updated port: b5279022-7d32-472f-bb75-c93e5fe99351 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1648.174383] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "refresh_cache-723972b1-3f91-4c59-b265-3975644dadb2" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1648.174383] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired lock "refresh_cache-723972b1-3f91-4c59-b265-3975644dadb2" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1648.174383] env[69648]: DEBUG nova.network.neutron [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1648.255173] env[69648]: DEBUG nova.network.neutron [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1648.584167] env[69648]: DEBUG nova.compute.manager [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Received event network-vif-plugged-b5279022-7d32-472f-bb75-c93e5fe99351 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1648.584424] env[69648]: DEBUG oslo_concurrency.lockutils [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] Acquiring lock "723972b1-3f91-4c59-b265-3975644dadb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1648.584635] env[69648]: DEBUG oslo_concurrency.lockutils [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] Lock "723972b1-3f91-4c59-b265-3975644dadb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1648.584805] env[69648]: DEBUG oslo_concurrency.lockutils [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] Lock "723972b1-3f91-4c59-b265-3975644dadb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1648.584980] env[69648]: DEBUG nova.compute.manager [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] No waiting events found dispatching network-vif-plugged-b5279022-7d32-472f-bb75-c93e5fe99351 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1648.585186] env[69648]: WARNING nova.compute.manager [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Received unexpected event network-vif-plugged-b5279022-7d32-472f-bb75-c93e5fe99351 for instance with vm_state building and task_state spawning. [ 1648.585350] env[69648]: DEBUG nova.compute.manager [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Received event network-changed-b5279022-7d32-472f-bb75-c93e5fe99351 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1648.585506] env[69648]: DEBUG nova.compute.manager [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Refreshing instance network info cache due to event network-changed-b5279022-7d32-472f-bb75-c93e5fe99351. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1648.585679] env[69648]: DEBUG oslo_concurrency.lockutils [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] Acquiring lock "refresh_cache-723972b1-3f91-4c59-b265-3975644dadb2" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1648.593425] env[69648]: DEBUG nova.network.neutron [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Updating instance_info_cache with network_info: [{"id": "b5279022-7d32-472f-bb75-c93e5fe99351", "address": "fa:16:3e:2b:66:4a", "network": {"id": "58c9db42-24fd-4615-9f91-977554db657a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-27923875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "896367398859465488fc12205d122a4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5279022-7d", "ovs_interfaceid": "b5279022-7d32-472f-bb75-c93e5fe99351", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1648.606178] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Releasing lock "refresh_cache-723972b1-3f91-4c59-b265-3975644dadb2" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1648.606971] env[69648]: DEBUG nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Instance network_info: |[{"id": "b5279022-7d32-472f-bb75-c93e5fe99351", "address": "fa:16:3e:2b:66:4a", "network": {"id": "58c9db42-24fd-4615-9f91-977554db657a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-27923875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "896367398859465488fc12205d122a4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5279022-7d", "ovs_interfaceid": 
"b5279022-7d32-472f-bb75-c93e5fe99351", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1648.607324] env[69648]: DEBUG oslo_concurrency.lockutils [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] Acquired lock "refresh_cache-723972b1-3f91-4c59-b265-3975644dadb2" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1648.607514] env[69648]: DEBUG nova.network.neutron [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Refreshing network info cache for port b5279022-7d32-472f-bb75-c93e5fe99351 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1648.609924] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2b:66:4a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '52f465cb-7418-4172-bd7d-aec00abeb692', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b5279022-7d32-472f-bb75-c93e5fe99351', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1648.622477] env[69648]: DEBUG oslo.service.loopingcall [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1648.624023] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1648.625728] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b9e95652-1494-440b-b44f-ad8affa89b2b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1648.650255] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1648.650255] env[69648]: value = "task-3466621" [ 1648.650255] env[69648]: _type = "Task" [ 1648.650255] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1648.660037] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466621, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1649.160657] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466621, 'name': CreateVM_Task, 'duration_secs': 0.273301} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1649.160924] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1649.161522] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1649.161693] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1649.162020] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1649.162576] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-261b0bd5-2a56-41f5-b5e4-002eeb98d814 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1649.169749] env[69648]: DEBUG oslo_vmware.api [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 1649.169749] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528e76e0-b00c-8603-a76f-563e309e6389" [ 1649.169749] env[69648]: _type = "Task" [ 1649.169749] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1649.185766] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1649.186025] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1649.186247] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1649.345019] env[69648]: DEBUG nova.network.neutron [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Updated VIF entry in instance network info cache for port b5279022-7d32-472f-bb75-c93e5fe99351. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1649.345111] env[69648]: DEBUG nova.network.neutron [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Updating instance_info_cache with network_info: [{"id": "b5279022-7d32-472f-bb75-c93e5fe99351", "address": "fa:16:3e:2b:66:4a", "network": {"id": "58c9db42-24fd-4615-9f91-977554db657a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-27923875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "896367398859465488fc12205d122a4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5279022-7d", "ovs_interfaceid": "b5279022-7d32-472f-bb75-c93e5fe99351", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1649.357099] env[69648]: DEBUG oslo_concurrency.lockutils [req-839a5560-b436-415a-9aa7-afb63b2ff133 req-3b172aec-b718-4023-b25f-832dded70033 service nova] Releasing lock "refresh_cache-723972b1-3f91-4c59-b265-3975644dadb2" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1670.458374] env[69648]: DEBUG 
oslo_concurrency.lockutils [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Acquiring lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1670.458703] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1675.064514] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1677.065524] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1678.066446] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1680.066220] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1682.061391] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1683.065661] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1683.076728] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1683.076946] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1683.077139] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1683.077302] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1683.078427] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ad3b34d-a4a7-483f-b243-14bb0e1f5fcf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.087175] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47dd8530-65b4-471e-8207-36cdd6b0d61f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.101981] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe413ed7-8e48-4343-a586-d75211f264c5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.107915] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ffcd2e4-4afa-4cbd-b834-268cbdd6561d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.136144] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180946MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1683.136296] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1683.136488] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1683.213160] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance a924bdee-1e16-4d78-ac6b-9574677de55f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1683.213360] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1683.213488] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 58804be5-ee46-4b25-be84-890d5cd1607f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1683.213615] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c97308be-406b-4fd0-b502-69e8c800773f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1683.213737] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1683.213859] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1683.213978] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 114bdafc-21f6-4a77-bf19-a444cbd8806c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1683.214172] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1683.214228] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1683.214345] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 723972b1-3f91-4c59-b265-3975644dadb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1683.225446] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 18745ec2-477d-427d-b2dd-997f73d9fd53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1683.235852] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1683.245714] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1683.255436] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3edf3b50-a4bf-4e75-927a-db78c433dbc4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1683.265558] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 31c84a7e-7a41-4d9f-ad29-6dad6648d85f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1683.265808] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1683.266150] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1683.435996] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e436b52-0ef1-4572-9d99-49c0222ba44c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.443825] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bc54b61-d5b8-447f-8bb0-40c8ce223002 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.474598] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d95ebc62-4cc9-440e-a2e4-c6d1e0a36ad6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.481961] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53ed09f7-1442-4ce8-a506-9fccda72c5cb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.494854] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1683.503219] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1683.517507] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1683.517655] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.381s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1684.517446] env[69648]: DEBUG oslo_service.periodic_task [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1684.517724] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1685.066448] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1687.066193] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1687.066487] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1687.066487] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1687.085928] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1687.086096] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1687.086236] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1687.086367] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1687.086495] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1687.086617] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Skipping network cache update for instance because it is Building. 
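The _heal_instance_info_cache entries above rebuild the list of instances to heal and skip any that are still Building. A minimal sketch of that filtering step (illustrative only, not Nova's implementation):

```python
# Shows the kind of filtering the "Skipping network cache update for
# instance because it is Building." lines above describe.
from dataclasses import dataclass

BUILDING = "building"

@dataclass
class Instance:
    uuid: str
    vm_state: str

def instances_to_heal(instances):
    """Return instances whose network info cache should be refreshed."""
    to_heal = []
    for inst in instances:
        if inst.vm_state == BUILDING:
            # Still building: skip the network cache update for now.
            continue
        to_heal.append(inst)
    return to_heal

if __name__ == "__main__":
    instances = [
        Instance("a924bdee-1e16-4d78-ac6b-9574677de55f", BUILDING),
        Instance("590dbeb2-7e21-454f-93b5-97065c5bfdb0", "active"),
    ]
    for inst in instances_to_heal(instances):
        print("would refresh info cache for", inst.uuid)
```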
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1687.086738] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1687.086859] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1687.086978] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1687.087109] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1687.087235] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1695.872054] env[69648]: WARNING oslo_vmware.rw_handles [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1695.872054] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1695.872054] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1695.872054] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1695.872054] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1695.872054] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1695.872054] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1695.872054] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1695.872054] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1695.872054] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1695.872054] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1695.872054] env[69648]: ERROR oslo_vmware.rw_handles [ 1695.872054] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/e0cf3896-a435-48a5-b398-d84f5f49161d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk 
on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1695.873974] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1695.874282] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Copying Virtual Disk [datastore1] vmware_temp/e0cf3896-a435-48a5-b398-d84f5f49161d/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/e0cf3896-a435-48a5-b398-d84f5f49161d/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1695.874637] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a0f4542c-6669-49fb-ba82-ff0709ddbdca {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1695.883291] env[69648]: DEBUG oslo_vmware.api [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for the task: (returnval){ [ 1695.883291] env[69648]: value = "task-3466632" [ 1695.883291] env[69648]: _type = "Task" [ 1695.883291] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1695.891557] env[69648]: DEBUG oslo_vmware.api [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Task: {'id': task-3466632, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1696.394185] env[69648]: DEBUG oslo_vmware.exceptions [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Fault InvalidArgument not matched. 
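The CopyVirtualDisk_Task lines above show the wait-for-task pattern: submit a vCenter task, poll its progress, and translate a terminal error state into a Python exception (here the InvalidArgument fault). A generic sketch of that loop; fetch_task_info and VimTaskError are illustrative stand-ins, not the oslo.vmware API.

```python
import time

class VimTaskError(Exception):
    """Raised when a task finishes in the 'error' state."""

def wait_for_task(fetch_task_info, interval=0.5):
    """Poll a task until it succeeds or fails.

    fetch_task_info() must return an object with .state ('running',
    'success' or 'error'), .progress and .error.
    """
    while True:
        info = fetch_task_info()
        if info.state == "success":
            return info
        if info.state == "error":
            # Corresponds to the fault translation that produced
            # "A specified parameter was not correct: fileType" above.
            raise VimTaskError(info.error)
        # Still running -- report progress and poll again.
        print(f"task progress is {info.progress}%")
        time.sleep(interval)
```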
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1696.394492] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1696.395221] env[69648]: ERROR nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1696.395221] env[69648]: Faults: ['InvalidArgument'] [ 1696.395221] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Traceback (most recent call last): [ 1696.395221] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1696.395221] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] yield resources [ 1696.395221] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1696.395221] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] self.driver.spawn(context, instance, image_meta, [ 1696.395221] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1696.395221] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1696.395221] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1696.395221] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] self._fetch_image_if_missing(context, vi) [ 1696.395221] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] image_cache(vi, tmp_image_ds_loc) [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] vm_util.copy_virtual_disk( [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] session._wait_for_task(vmdk_copy_task) [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] return self.wait_for_task(task_ref) [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] return evt.wait() [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] result = hub.switch() [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1696.395662] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] return self.greenlet.switch() [ 1696.396194] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1696.396194] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] self.f(*self.args, **self.kw) [ 1696.396194] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1696.396194] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] raise exceptions.translate_fault(task_info.error) [ 1696.396194] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1696.396194] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Faults: ['InvalidArgument'] [ 1696.396194] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] [ 1696.396194] env[69648]: INFO nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Terminating instance [ 1696.397167] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1696.397383] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1696.397698] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0f5169c6-2f7c-43e5-9640-8f231c6213b6 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.400411] env[69648]: DEBUG nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1696.400628] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1696.401424] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-946c0fa6-a74b-4f4b-87fb-4ecd81658b83 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.408818] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1696.409980] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-940bb3ee-9988-4c89-a7f4-db966ce2aec0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.411544] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1696.411725] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1696.412431] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3ce02572-0e27-4e7f-aa9f-b9266c666a7b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.417879] env[69648]: DEBUG oslo_vmware.api [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 1696.417879] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52da2e04-055b-b24c-5017-65244d4ae344" [ 1696.417879] env[69648]: _type = "Task" [ 1696.417879] env[69648]: } to complete. 
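The MakeDirectory / "Folder [datastore1] devstack-image-cache_base created" sequence above is an idempotent ensure-directory step: creating the image cache folder must succeed whether or not it already exists. A toy version of that idea, where make_directory and FileAlreadyExists are hypothetical stand-ins:

```python
class FileAlreadyExists(Exception):
    pass

def create_folder_if_missing(make_directory, datastore, path):
    """Create the cache folder, treating 'already exists' as success."""
    try:
        make_directory(datastore, path, create_parents=True)
        print(f"Folder [{datastore}] {path} created.")
    except FileAlreadyExists:
        # Another request (or an earlier build) already made it; fine.
        print(f"Folder [{datastore}] {path} already exists.")
```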
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1696.432811] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1696.433107] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Creating directory with path [datastore1] vmware_temp/7aeb7102-cb33-44f7-a5f2-f41f61fbbaf3/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1696.433364] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-db14ec13-c5d8-4d2a-bd7e-db8bac786bbd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.457071] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Created directory with path [datastore1] vmware_temp/7aeb7102-cb33-44f7-a5f2-f41f61fbbaf3/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1696.457306] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Fetch image to [datastore1] vmware_temp/7aeb7102-cb33-44f7-a5f2-f41f61fbbaf3/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1696.457520] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/7aeb7102-cb33-44f7-a5f2-f41f61fbbaf3/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1696.458367] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-044053fe-b236-46d6-9468-ae08645d0c09 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.465557] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e674858-c0ac-425e-ae88-9b2730e06613 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.476262] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d95d8d4-ffb3-4383-958b-4cb55d671e7b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.516249] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08b15538-9dd8-4f11-930f-46b9b343f182 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.518984] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1696.519198] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1696.519377] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Deleting the datastore file [datastore1] a924bdee-1e16-4d78-ac6b-9574677de55f {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1696.519622] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-db3921a1-51eb-4e8c-b37d-db1932b46507 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.524666] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5aa5ff2e-5844-485e-9c2e-dbab6816f0d4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.527594] env[69648]: DEBUG oslo_vmware.api [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for the task: (returnval){ [ 1696.527594] env[69648]: value = "task-3466634" [ 1696.527594] env[69648]: _type = "Task" [ 1696.527594] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1696.534843] env[69648]: DEBUG oslo_vmware.api [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Task: {'id': task-3466634, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1696.549665] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1696.603318] env[69648]: DEBUG oslo_vmware.rw_handles [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7aeb7102-cb33-44f7-a5f2-f41f61fbbaf3/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1696.662449] env[69648]: DEBUG oslo_vmware.rw_handles [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1696.662620] env[69648]: DEBUG oslo_vmware.rw_handles [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7aeb7102-cb33-44f7-a5f2-f41f61fbbaf3/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1697.039024] env[69648]: DEBUG oslo_vmware.api [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Task: {'id': task-3466634, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081433} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1697.039024] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1697.039024] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1697.039411] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1697.039411] env[69648]: INFO nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Took 0.64 seconds to destroy the instance on the hypervisor. 
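The rw_handles lines above create an HTTPS write handle to the datastore folder URL, stream the image iterator through it, and read the server's response when the handle is closed; the earlier RemoteDisconnected warning comes from that final read. A rough sketch of the same flow using only the standard library; the host, path and cookie handling are placeholders rather than the real oslo.vmware code.

```python
import http.client

def upload_image(host, path, data_iter, size, cookie):
    """Stream image bytes to a datastore folder URL over HTTPS PUT."""
    conn = http.client.HTTPSConnection(host, 443)
    conn.putrequest("PUT", path)
    conn.putheader("Content-Length", str(size))
    conn.putheader("Cookie", cookie)
    conn.endheaders()
    for chunk in data_iter:
        # "Completed reading data from the image iterator." once exhausted.
        conn.send(chunk)
    # Closing the handle reads the response; if the server drops the
    # connection first, http.client raises RemoteDisconnected -- the
    # WARNING traceback earlier in this log.
    resp = conn.getresponse()
    resp.read()
    conn.close()
    return resp.status
```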
[ 1697.041472] env[69648]: DEBUG nova.compute.claims [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1697.041649] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1697.041881] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1697.261923] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4e92324-f6b2-4f90-872b-e7c8803ae222 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.269334] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a702ce89-48f4-44f8-959a-22322b660c86 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.299491] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f7b8fa3-e224-406e-a929-73a6e051a508 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.305986] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea4e8148-f9c6-4f44-a3e3-87080864be6d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.318347] env[69648]: DEBUG nova.compute.provider_tree [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1697.326751] env[69648]: DEBUG nova.scheduler.client.report [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1697.340266] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 
tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.298s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1697.340773] env[69648]: ERROR nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1697.340773] env[69648]: Faults: ['InvalidArgument'] [ 1697.340773] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Traceback (most recent call last): [ 1697.340773] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1697.340773] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] self.driver.spawn(context, instance, image_meta, [ 1697.340773] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1697.340773] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1697.340773] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1697.340773] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] self._fetch_image_if_missing(context, vi) [ 1697.340773] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1697.340773] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] image_cache(vi, tmp_image_ds_loc) [ 1697.340773] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] vm_util.copy_virtual_disk( [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] session._wait_for_task(vmdk_copy_task) [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] return self.wait_for_task(task_ref) [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] return evt.wait() [ 
1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] result = hub.switch() [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] return self.greenlet.switch() [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1697.341153] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] self.f(*self.args, **self.kw) [ 1697.341606] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1697.341606] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] raise exceptions.translate_fault(task_info.error) [ 1697.341606] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1697.341606] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Faults: ['InvalidArgument'] [ 1697.341606] env[69648]: ERROR nova.compute.manager [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] [ 1697.341606] env[69648]: DEBUG nova.compute.utils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1697.342706] env[69648]: DEBUG nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Build of instance a924bdee-1e16-4d78-ac6b-9574677de55f was re-scheduled: A specified parameter was not correct: fileType [ 1697.342706] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1697.343080] env[69648]: DEBUG nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1697.343257] env[69648]: DEBUG nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
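The traceback and the "was re-scheduled" message above document the failure path: spawn() raises, the compute host aborts its resource claim, deallocates networking, and hands the build back to the scheduler. A compressed sketch of that control flow; every class and method name here is an illustrative stand-in for the Nova objects involved.

```python
class RescheduledException(Exception):
    """Signals that another host should retry the build."""

def build_and_run(driver, claim, network_api, instance, log):
    try:
        driver.spawn(instance)
    except Exception as exc:
        claim.abort()                                  # give resources back
        network_api.deallocate_for_instance(instance)  # tear down networking
        log(f"Build of instance {instance['uuid']} was re-scheduled: {exc}")
        raise RescheduledException(str(exc)) from exc
```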
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1697.343414] env[69648]: DEBUG nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1697.343575] env[69648]: DEBUG nova.network.neutron [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1697.876906] env[69648]: DEBUG nova.network.neutron [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1697.888147] env[69648]: INFO nova.compute.manager [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Took 0.54 seconds to deallocate network for instance. [ 1697.984280] env[69648]: INFO nova.scheduler.client.report [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Deleted allocations for instance a924bdee-1e16-4d78-ac6b-9574677de55f [ 1698.007793] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c377d93e-28c7-48f7-8787-019615a06830 tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "a924bdee-1e16-4d78-ac6b-9574677de55f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 569.544s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1698.009152] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "a924bdee-1e16-4d78-ac6b-9574677de55f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 373.060s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1698.009400] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Acquiring lock "a924bdee-1e16-4d78-ac6b-9574677de55f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1698.009624] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "a924bdee-1e16-4d78-ac6b-9574677de55f-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1698.009808] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "a924bdee-1e16-4d78-ac6b-9574677de55f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1698.012860] env[69648]: INFO nova.compute.manager [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Terminating instance [ 1698.014546] env[69648]: DEBUG nova.compute.manager [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1698.014871] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1698.015394] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5c7f28f9-8129-4b3b-b561-5214ce56df7f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.025629] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee4558f4-8b07-4ab9-b1e3-6d93335ff63e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.054723] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a924bdee-1e16-4d78-ac6b-9574677de55f could not be found. [ 1698.055013] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1698.055127] env[69648]: INFO nova.compute.manager [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1698.055392] env[69648]: DEBUG oslo.service.loopingcall [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1698.055724] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1698.058020] env[69648]: DEBUG nova.compute.manager [-] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1698.058127] env[69648]: DEBUG nova.network.neutron [-] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1698.085529] env[69648]: DEBUG nova.network.neutron [-] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1698.101038] env[69648]: INFO nova.compute.manager [-] [instance: a924bdee-1e16-4d78-ac6b-9574677de55f] Took 0.04 seconds to deallocate network for instance. [ 1698.113341] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1698.113626] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1698.115714] env[69648]: INFO nova.compute.claims [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1698.190960] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41237612-780a-4cd9-a44a-d58a9bcd417f tempest-ListImageFiltersTestJSON-2079017196 tempest-ListImageFiltersTestJSON-2079017196-project-member] Lock "a924bdee-1e16-4d78-ac6b-9574677de55f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1698.335227] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-052148f3-75af-44c5-bd29-4b756c7e0076 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
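The lockutils lines above ("acquired ... waited 0.000s", "released ... held 0.182s") time both phases of a named lock: how long the caller waited to get it, and how long it was held. A generic sketch of that instrumentation (not oslo.concurrency's implementation):

```python
import threading
import time
from contextlib import contextmanager

_locks = {}

@contextmanager
def timed_lock(name, caller):
    lock = _locks.setdefault(name, threading.Lock())
    start = time.monotonic()
    print(f'Acquiring lock "{name}" by "{caller}"')
    with lock:
        waited = time.monotonic() - start
        print(f'Lock "{name}" acquired by "{caller}" :: waited {waited:.3f}s')
        held_start = time.monotonic()
        try:
            yield
        finally:
            held = time.monotonic() - held_start
            print(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')

with timed_lock("compute_resources", "ResourceTracker.instance_claim"):
    time.sleep(0.05)
```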
1698.343036] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed7cf78f-0f61-4cfa-bb69-a564f0473b3e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.372940] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7e114b0-b08d-4db1-86c4-a25ee7e94f69 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.380413] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab3bf09a-a076-4076-99ac-6d4cf1eb9dff {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.393572] env[69648]: DEBUG nova.compute.provider_tree [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1698.402994] env[69648]: DEBUG nova.scheduler.client.report [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1698.415741] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1698.416235] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1698.449485] env[69648]: DEBUG nova.compute.utils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1698.451370] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Allocating IP information in the background. 
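The "Inventory has not changed for provider ..." lines above reflect a cache comparison: inventory is only pushed to Placement when the copy about to be sent differs from what was last recorded. For the flat dict structure shown in the log, a plain equality check is enough to sketch the idea (illustrative, not the report client's code):

```python
def inventory_changed(cached, new):
    return cached != new

cached = {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                   'step_size': 1, 'allocation_ratio': 4.0}}
new = dict(cached)   # same data about to be reported

if inventory_changed(cached, new):
    print("updating provider inventory in Placement")
else:
    print("Inventory has not changed; skipping update")
```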
{{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1698.451605] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1698.459648] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1698.506371] env[69648]: DEBUG nova.policy [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6e63e22c15c457abb91ad9f4cde2983', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c24c07422cdb4ae193a0ad8fde391d7a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1698.526478] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Start spawning the instance on the hypervisor. 
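The nova.policy line above shows a rule check (network:attach_external_network) failing for a token that only carries the member and reader roles. A toy rule table in the same spirit; the rule lambda and credential shape are illustrative, not oslo.policy's engine.

```python
RULES = {
    # Only admins may attach to external networks in this toy example.
    "network:attach_external_network": lambda creds: "admin" in creds["roles"],
}

def authorize(rule, creds):
    allowed = RULES[rule](creds)
    if not allowed:
        print(f"Policy check for {rule} failed with credentials {creds}")
    return allowed

authorize("network:attach_external_network",
          {"user_id": "e6e63e22c15c457abb91ad9f4cde2983",
           "roles": ["member", "reader"]})
```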
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1698.551376] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1698.551620] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1698.551773] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1698.551959] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1698.552123] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1698.552273] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1698.552484] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1698.552647] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1698.552813] env[69648]: DEBUG nova.virt.hardware [None 
req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1698.553014] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1698.553267] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1698.554148] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b712de0-9148-4eb0-9f87-3464e562e62f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.561886] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e855bc09-4d64-4048-8982-8bc6c1e8f5bc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.899951] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Successfully created port: 30edd0ca-0b10-4d07-ba8b-73174a76fdba {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1699.551173] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Successfully updated port: 30edd0ca-0b10-4d07-ba8b-73174a76fdba {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1699.561855] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "refresh_cache-18745ec2-477d-427d-b2dd-997f73d9fd53" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1699.562050] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquired lock "refresh_cache-18745ec2-477d-427d-b2dd-997f73d9fd53" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1699.562208] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1699.597916] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 
tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1699.751684] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Updating instance_info_cache with network_info: [{"id": "30edd0ca-0b10-4d07-ba8b-73174a76fdba", "address": "fa:16:3e:33:79:bf", "network": {"id": "130e81f4-2301-4499-b916-449ad32b9389", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-868563699-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c24c07422cdb4ae193a0ad8fde391d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30edd0ca-0b", "ovs_interfaceid": "30edd0ca-0b10-4d07-ba8b-73174a76fdba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1699.763010] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Releasing lock "refresh_cache-18745ec2-477d-427d-b2dd-997f73d9fd53" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1699.763311] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Instance network_info: |[{"id": "30edd0ca-0b10-4d07-ba8b-73174a76fdba", "address": "fa:16:3e:33:79:bf", "network": {"id": "130e81f4-2301-4499-b916-449ad32b9389", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-868563699-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c24c07422cdb4ae193a0ad8fde391d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30edd0ca-0b", "ovs_interfaceid": "30edd0ca-0b10-4d07-ba8b-73174a76fdba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1699.763694] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:33:79:bf', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f54f7284-8f7d-47ee-839d-2143062cfe44', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '30edd0ca-0b10-4d07-ba8b-73174a76fdba', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1699.772009] env[69648]: DEBUG oslo.service.loopingcall [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1699.772481] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1699.772743] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5dd4b375-682a-41a6-ac46-c5f9da7906ec {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1699.793090] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1699.793090] env[69648]: value = "task-3466635" [ 1699.793090] env[69648]: _type = "Task" [ 1699.793090] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1699.804094] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466635, 'name': CreateVM_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1699.932408] env[69648]: DEBUG nova.compute.manager [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Received event network-vif-plugged-30edd0ca-0b10-4d07-ba8b-73174a76fdba {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1699.932408] env[69648]: DEBUG oslo_concurrency.lockutils [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] Acquiring lock "18745ec2-477d-427d-b2dd-997f73d9fd53-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1699.932408] env[69648]: DEBUG oslo_concurrency.lockutils [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] Lock "18745ec2-477d-427d-b2dd-997f73d9fd53-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1699.932408] env[69648]: DEBUG oslo_concurrency.lockutils [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] Lock "18745ec2-477d-427d-b2dd-997f73d9fd53-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1699.932835] env[69648]: DEBUG nova.compute.manager [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] No waiting events found dispatching network-vif-plugged-30edd0ca-0b10-4d07-ba8b-73174a76fdba {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1699.932835] env[69648]: WARNING nova.compute.manager [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Received unexpected event network-vif-plugged-30edd0ca-0b10-4d07-ba8b-73174a76fdba for instance with vm_state building and task_state spawning. [ 1699.932835] env[69648]: DEBUG nova.compute.manager [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Received event network-changed-30edd0ca-0b10-4d07-ba8b-73174a76fdba {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1699.932835] env[69648]: DEBUG nova.compute.manager [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Refreshing instance network info cache due to event network-changed-30edd0ca-0b10-4d07-ba8b-73174a76fdba. 
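Further up, the nova.virt.hardware lines walk the m1.nano flavor (1 vCPU, no flavor or image limits/preferences, caps of 65536) down to a single candidate topology, sockets=1 cores=1 threads=1. A simplified, self-contained illustration of that enumeration (not nova's actual implementation):

```python
# Simplified sketch of the topology search logged above; the 65536 defaults
# mirror the "limits were sockets=65536, cores=65536, threads=65536" line.
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Yield every sockets*cores*threads split that multiplies out to vcpus."""
    for s in range(1, min(max_sockets, vcpus) + 1):
        for c in range(1, min(max_cores, vcpus) + 1):
            for t in range(1, min(max_threads, vcpus) + 1):
                if s * c * t == vcpus:
                    yield VirtCPUTopology(s, c, t)

# 1 vCPU -> exactly one candidate, matching "Got 1 possible topologies"
print(list(possible_topologies(1)))
```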
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1699.933073] env[69648]: DEBUG oslo_concurrency.lockutils [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] Acquiring lock "refresh_cache-18745ec2-477d-427d-b2dd-997f73d9fd53" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1699.933136] env[69648]: DEBUG oslo_concurrency.lockutils [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] Acquired lock "refresh_cache-18745ec2-477d-427d-b2dd-997f73d9fd53" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1699.933272] env[69648]: DEBUG nova.network.neutron [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Refreshing network info cache for port 30edd0ca-0b10-4d07-ba8b-73174a76fdba {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1700.214626] env[69648]: DEBUG nova.network.neutron [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Updated VIF entry in instance network info cache for port 30edd0ca-0b10-4d07-ba8b-73174a76fdba. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1700.215028] env[69648]: DEBUG nova.network.neutron [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Updating instance_info_cache with network_info: [{"id": "30edd0ca-0b10-4d07-ba8b-73174a76fdba", "address": "fa:16:3e:33:79:bf", "network": {"id": "130e81f4-2301-4499-b916-449ad32b9389", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-868563699-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c24c07422cdb4ae193a0ad8fde391d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30edd0ca-0b", "ovs_interfaceid": "30edd0ca-0b10-4d07-ba8b-73174a76fdba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1700.224520] env[69648]: DEBUG oslo_concurrency.lockutils [req-ac7529cb-95f0-4330-9eb4-53f3cb9f399c req-a463e8d3-b755-4c86-ae5d-950188d594ca service nova] Releasing lock "refresh_cache-18745ec2-477d-427d-b2dd-997f73d9fd53" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1700.302984] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466635, 'name': CreateVM_Task, 'duration_secs': 0.310849} completed successfully. 
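The network-vif-plugged handling above follows nova's external-event pattern: pop a registered waiter for the (instance, event) pair, and if none exists log the "Received unexpected event" warning and move on, because the notification simply arrived before the spawn path registered a waiter. A self-contained illustration of that pattern (not nova's actual code):

```python
# Illustrative sketch of instance-event waiters, heavily simplified.
import threading

_waiters = {}  # (instance_uuid, event_name) -> threading.Event

def prepare_for_event(instance_uuid, event_name):
    evt = threading.Event()
    _waiters[(instance_uuid, event_name)] = evt
    return evt

def dispatch_event(instance_uuid, event_name):
    evt = _waiters.pop((instance_uuid, event_name), None)
    if evt is None:
        print(f"WARNING: received unexpected event {event_name}")
        return
    evt.set()

instance = "18745ec2-477d-427d-b2dd-997f73d9fd53"
event = "network-vif-plugged-30edd0ca-0b10-4d07-ba8b-73174a76fdba"
dispatch_event(instance, event)              # no waiter yet -> warning, as logged
waiter = prepare_for_event(instance, event)  # the spawn path would block on this
dispatch_event(instance, event)
print(waiter.is_set())                       # True
```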
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1700.304020] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1700.304020] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1700.304189] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1700.304487] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1700.304735] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-418118dc-eb30-482e-8458-a00bae4a57ae {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.309183] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: (returnval){ [ 1700.309183] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52f20880-1019-42a2-30eb-7f9be0ee6332" [ 1700.309183] env[69648]: _type = "Task" [ 1700.309183] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1700.316586] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52f20880-1019-42a2-30eb-7f9be0ee6332, 'name': SearchDatastore_Task} progress is 0%. 
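The "[datastore1] devstack-image-cache_base/..." lock above serializes work on the per-image cache directory so concurrent builds of the same image do not race on the datastore. A hedged sketch of that serialization with oslo.concurrency; the lock name is copied from the log, while the guarded function is a placeholder:

```python
# Sketch under assumptions: the real driver holds this lock around the
# SearchDatastore_Task / fetch path; _fetch_if_missing() is hypothetical.
from oslo_concurrency import lockutils

IMAGE_CACHE_LOCK = ("[datastore1] devstack-image-cache_base/"
                    "b010aefa-553b-437c-bd1e-78b0a276a491")

def _fetch_if_missing():
    print("checking the image cache, downloading only if absent")

def cache_image():
    with lockutils.lock(IMAGE_CACHE_LOCK):
        _fetch_if_missing()

cache_image()
```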
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1700.820146] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1700.820527] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1700.820617] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1737.065560] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1737.090042] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1738.064903] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1738.065180] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1741.065776] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1743.061191] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1744.065372] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1744.078361] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None 
None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1744.078588] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1744.078761] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1744.078965] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1744.080100] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36b8a1cf-956f-4c98-8064-7b7c999891aa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.088871] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b4eb3f8-4fca-49fd-beb5-4f79e11bbbe3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.102803] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d78e7e8a-9a86-461c-9f6e-b368382a8e2c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.109156] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a49775cb-83a7-4725-8162-81829d29f460 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.138125] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180933MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1744.138348] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1744.138561] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1744.280747] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 
ba0b4adc-fa4a-4b36-bb86-58ff038c834e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1744.280923] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 58804be5-ee46-4b25-be84-890d5cd1607f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1744.281072] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c97308be-406b-4fd0-b502-69e8c800773f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1744.281203] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1744.281327] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1744.281451] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 114bdafc-21f6-4a77-bf19-a444cbd8806c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1744.281574] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1744.281696] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1744.281817] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 723972b1-3f91-4c59-b265-3975644dadb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1744.281965] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 18745ec2-477d-427d-b2dd-997f73d9fd53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1744.292947] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1744.302663] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1744.312511] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3edf3b50-a4bf-4e75-927a-db78c433dbc4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1744.321839] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 31c84a7e-7a41-4d9f-ad29-6dad6648d85f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1744.322072] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1744.322226] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1744.337211] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing inventories for resource provider d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1744.350361] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Updating ProviderTree inventory for provider d38a352b-7808-44da-8216-792e96aadc88 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1744.350535] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Updating inventory in ProviderTree for provider d38a352b-7808-44da-8216-792e96aadc88 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1744.360771] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing aggregate associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, aggregates: None {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1744.377278] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing trait associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1744.531148] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4275fa5e-e711-4264-af90-05eb3aa26b3f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.539326] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-95c5dfd8-aebe-4769-8f88-3a7ba8b78670 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.568653] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-379e1d58-294d-425e-b056-2e9323b8e1db {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.575878] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78055752-6289-4283-84b9-1292fe0c7556 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.588964] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1744.597892] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1744.618324] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1744.618521] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.480s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1745.618645] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1745.619017] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
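A quick cross-check of the "Final resource view" above, on the assumption (ours, it is not spelled out in the log) that used memory is the 512 MB reserved allowance plus the ten actively managed 128 MB / 1 vCPU / 1 GB instances, with the four "yet to start" instances excluded from usage:

```python
# Back-of-the-envelope check of used_ram=1792MB, used_vcpus=10, used_disk=10GB.
reserved_mb = 512        # 'reserved': 512 in the MEMORY_MB inventory above
active_instances = 10    # instances listed as actively managed on this host
used_ram_mb = reserved_mb + active_instances * 128
used_vcpus = active_instances * 1
used_disk_gb = active_instances * 1
print(used_ram_mb, used_vcpus, used_disk_gb)  # 1792 10 10, as logged
```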
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1745.887592] env[69648]: WARNING oslo_vmware.rw_handles [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1745.887592] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1745.887592] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1745.887592] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1745.887592] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1745.887592] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1745.887592] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1745.887592] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1745.887592] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1745.887592] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1745.887592] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1745.887592] env[69648]: ERROR oslo_vmware.rw_handles [ 1745.888083] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/7aeb7102-cb33-44f7-a5f2-f41f61fbbaf3/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1745.890072] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1745.890341] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Copying Virtual Disk [datastore1] vmware_temp/7aeb7102-cb33-44f7-a5f2-f41f61fbbaf3/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/7aeb7102-cb33-44f7-a5f2-f41f61fbbaf3/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1745.890638] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-fca15458-f819-4723-a2d8-42db67c1bc7c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.898268] env[69648]: DEBUG oslo_vmware.api [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 1745.898268] env[69648]: value = "task-3466636" [ 
1745.898268] env[69648]: _type = "Task" [ 1745.898268] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1745.905934] env[69648]: DEBUG oslo_vmware.api [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': task-3466636, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1746.065761] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1746.409308] env[69648]: DEBUG oslo_vmware.exceptions [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1746.409551] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1746.410105] env[69648]: ERROR nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1746.410105] env[69648]: Faults: ['InvalidArgument'] [ 1746.410105] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Traceback (most recent call last): [ 1746.410105] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1746.410105] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] yield resources [ 1746.410105] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1746.410105] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] self.driver.spawn(context, instance, image_meta, [ 1746.410105] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1746.410105] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1746.410105] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1746.410105] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] self._fetch_image_if_missing(context, vi) [ 1746.410105] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1746.410105] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] image_cache(vi, tmp_image_ds_loc) [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] vm_util.copy_virtual_disk( [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] session._wait_for_task(vmdk_copy_task) [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] return self.wait_for_task(task_ref) [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] return evt.wait() [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] result = hub.switch() [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] return self.greenlet.switch() [ 1746.410547] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1746.410961] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] self.f(*self.args, **self.kw) [ 1746.410961] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1746.410961] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] raise exceptions.translate_fault(task_info.error) [ 1746.410961] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1746.410961] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Faults: ['InvalidArgument'] [ 1746.410961] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] [ 1746.410961] env[69648]: INFO nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] 
Terminating instance [ 1746.411944] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1746.412172] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1746.412401] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0ef44bfa-81a0-4188-9196-9975dde3ef23 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.415786] env[69648]: DEBUG nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1746.415974] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1746.416679] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-887692bc-2343-48c4-bc78-fc6497e1eb0f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.420013] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1746.420214] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Folder [datastore1] devstack-image-cache_base created. 
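The traceback further up ends in oslo_vmware.exceptions.VimFaultException with Faults: ['InvalidArgument'] on the fileType parameter of CopyVirtualDisk_Task, after which the manager terminates the instance (UnregisterVM, datastore file deletion, claim abort). A hedged sketch of that failure path; `session`, `copy_task`, and `cleanup` are stand-ins, not objects from this log:

```python
# Sketch under assumptions: wait_for_task() surfaces the vCenter fault as
# VimFaultException, and cleanup() stands in for the unregister/delete/abort
# sequence that follows in the log.
from oslo_vmware import exceptions as vexc

def cache_sparse_image(session, copy_task, cleanup):
    try:
        return session.wait_for_task(copy_task)
    except vexc.VimFaultException as exc:
        # exc.fault_list carries the fault names, e.g. ['InvalidArgument']
        print("Instance failed to spawn: %s" % exc)
        cleanup()
        raise
```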
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1746.421143] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8fd13c24-5711-4c1e-b83b-b539e4014e8c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.424851] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1746.425374] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-81eac1ef-9453-4b00-af46-79320a5ba38b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.427585] env[69648]: DEBUG oslo_vmware.api [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: (returnval){ [ 1746.427585] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52ca55d6-432f-7a94-0575-5f8ec93c2302" [ 1746.427585] env[69648]: _type = "Task" [ 1746.427585] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1746.434651] env[69648]: DEBUG oslo_vmware.api [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52ca55d6-432f-7a94-0575-5f8ec93c2302, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1746.496443] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1746.496652] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1746.496842] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Deleting the datastore file [datastore1] ba0b4adc-fa4a-4b36-bb86-58ff038c834e {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1746.497125] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4faa3cf4-4c47-47c2-a109-0bad814f16db {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.503561] env[69648]: DEBUG oslo_vmware.api [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 1746.503561] env[69648]: value = "task-3466638" [ 1746.503561] env[69648]: _type = "Task" [ 1746.503561] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1746.510885] env[69648]: DEBUG oslo_vmware.api [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': task-3466638, 'name': DeleteDatastoreFile_Task} progress is 0%. 
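Progress lines such as the DeleteDatastoreFile_Task one above come from a simple poll loop: the task state is re-read at a fixed interval until it reaches a terminal value. A generic, self-contained illustration of that loop (not oslo.vmware's implementation):

```python
# Generic polling sketch; the fake poll() below is ours, the real code reads
# TaskInfo from vCenter via the property collector.
import time

def wait_for_task(poll, interval=0.5):
    while True:
        state, progress = poll()
        print(f"progress is {progress}%")
        if state == "success":
            return
        if state == "error":
            raise RuntimeError("task failed")
        time.sleep(interval)

states = iter([("running", 0), ("running", 50), ("success", 100)])
wait_for_task(lambda: next(states), interval=0)
```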
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1746.938831] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1746.939187] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Creating directory with path [datastore1] vmware_temp/98099ce1-9ccd-43ed-a2b9-c0acbe5c08c7/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1746.939319] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a513b08c-ea69-425c-b152-056a258c88bf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.951388] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Created directory with path [datastore1] vmware_temp/98099ce1-9ccd-43ed-a2b9-c0acbe5c08c7/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1746.951572] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Fetch image to [datastore1] vmware_temp/98099ce1-9ccd-43ed-a2b9-c0acbe5c08c7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1746.951743] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/98099ce1-9ccd-43ed-a2b9-c0acbe5c08c7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1746.952444] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1218bb0-5e2d-4a35-b33e-8b9c0e0ff486 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.958691] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74e35316-74a7-4a2e-8552-b23486a2bd9d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.967287] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-571a0a31-f467-4cef-984d-b97983d4e995 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.998406] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efd30e35-78b6-4dc9-8bcb-9e27c1732899 
{{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.007624] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6c434f14-1e03-40b6-a9af-362b3c1cbbf5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.014292] env[69648]: DEBUG oslo_vmware.api [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': task-3466638, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081666} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1747.014513] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1747.014686] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1747.014854] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1747.015038] env[69648]: INFO nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1747.017156] env[69648]: DEBUG nova.compute.claims [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1747.017329] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1747.017542] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1747.034842] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1747.086999] env[69648]: DEBUG oslo_vmware.rw_handles [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/98099ce1-9ccd-43ed-a2b9-c0acbe5c08c7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1747.146446] env[69648]: DEBUG oslo_vmware.rw_handles [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1747.146644] env[69648]: DEBUG oslo_vmware.rw_handles [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/98099ce1-9ccd-43ed-a2b9-c0acbe5c08c7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1747.260734] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b13bdd25-612e-4caa-b32c-3b4312b33c7f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.267897] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e365d20a-cd53-4fe4-9624-84a69510bc94 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.297184] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e19ed00-267a-485a-866d-0ae3157f994f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.303813] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f43d28ae-cf7a-44dd-885a-166c1d6312b5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.316343] env[69648]: DEBUG nova.compute.provider_tree [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1747.324804] env[69648]: DEBUG nova.scheduler.client.report [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1747.337516] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.320s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1747.338083] env[69648]: ERROR nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1747.338083] env[69648]: Faults: ['InvalidArgument'] [ 1747.338083] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Traceback (most recent call last): [ 1747.338083] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1747.338083] env[69648]: ERROR nova.compute.manager [instance: 
ba0b4adc-fa4a-4b36-bb86-58ff038c834e] self.driver.spawn(context, instance, image_meta, [ 1747.338083] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1747.338083] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1747.338083] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1747.338083] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] self._fetch_image_if_missing(context, vi) [ 1747.338083] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1747.338083] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] image_cache(vi, tmp_image_ds_loc) [ 1747.338083] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] vm_util.copy_virtual_disk( [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] session._wait_for_task(vmdk_copy_task) [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] return self.wait_for_task(task_ref) [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] return evt.wait() [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] result = hub.switch() [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] return self.greenlet.switch() [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1747.338468] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] self.f(*self.args, **self.kw) [ 1747.338845] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1747.338845] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] raise exceptions.translate_fault(task_info.error) [ 1747.338845] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1747.338845] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Faults: ['InvalidArgument'] [ 1747.338845] env[69648]: ERROR nova.compute.manager [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] [ 1747.338845] env[69648]: DEBUG nova.compute.utils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1747.340147] env[69648]: DEBUG nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Build of instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e was re-scheduled: A specified parameter was not correct: fileType [ 1747.340147] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1747.340547] env[69648]: DEBUG nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1747.340724] env[69648]: DEBUG nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1747.340898] env[69648]: DEBUG nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1747.341127] env[69648]: DEBUG nova.network.neutron [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1747.640034] env[69648]: DEBUG nova.network.neutron [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1747.653241] env[69648]: INFO nova.compute.manager [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Took 0.31 seconds to deallocate network for instance. [ 1747.752541] env[69648]: INFO nova.scheduler.client.report [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Deleted allocations for instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e [ 1747.776814] env[69648]: DEBUG oslo_concurrency.lockutils [None req-031709af-6371-46d5-b409-7af63f014567 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 557.363s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1747.776814] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 361.181s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1747.776814] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1747.776814] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1747.777084] env[69648]: 
DEBUG oslo_concurrency.lockutils [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1747.778573] env[69648]: INFO nova.compute.manager [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Terminating instance [ 1747.780288] env[69648]: DEBUG nova.compute.manager [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1747.780492] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1747.780953] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a195ac0e-e8b6-499a-a285-64644fd8af19 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.792101] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-660d1838-b03d-4995-9914-07154e7800fa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.805081] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1747.827052] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ba0b4adc-fa4a-4b36-bb86-58ff038c834e could not be found. [ 1747.827283] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1747.827461] env[69648]: INFO nova.compute.manager [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1747.827708] env[69648]: DEBUG oslo.service.loopingcall [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1747.827982] env[69648]: DEBUG nova.compute.manager [-] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1747.828102] env[69648]: DEBUG nova.network.neutron [-] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1747.860762] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1747.860762] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1747.860762] env[69648]: INFO nova.compute.claims [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1747.865054] env[69648]: DEBUG nova.network.neutron [-] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1747.875194] env[69648]: INFO nova.compute.manager [-] [instance: ba0b4adc-fa4a-4b36-bb86-58ff038c834e] Took 0.05 seconds to deallocate network for instance. 
[ 1747.977624] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4edc4d5f-237a-4b64-92f7-45b7eb3223f7 tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "ba0b4adc-fa4a-4b36-bb86-58ff038c834e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.201s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1748.067059] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1748.067171] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1748.067251] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1748.098032] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1748.098145] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1748.098270] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1748.098445] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1748.098597] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1748.098801] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1748.099036] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1748.099262] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1748.099432] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1748.100058] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1748.100058] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1748.100737] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1748.100964] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}} [ 1748.104995] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbfb3faf-df59-4fd9-93dd-d365bec31e64 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.109890] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] There are 0 instances to clean {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}} [ 1748.115539] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c351e8b-5831-4a68-9a72-9767b254b574 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.147333] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0a7a19f-9994-447c-b7b6-92fa62e65af6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.155242] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84702d8c-084d-4236-b851-c69f089e3ae4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.169184] env[69648]: DEBUG nova.compute.provider_tree [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1748.177193] env[69648]: DEBUG 
nova.scheduler.client.report [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1748.192433] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.333s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1748.192933] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1748.231871] env[69648]: DEBUG nova.compute.utils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1748.233205] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1748.233397] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1748.247926] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Start building block device mappings for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1748.297355] env[69648]: DEBUG nova.policy [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6e63e22c15c457abb91ad9f4cde2983', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c24c07422cdb4ae193a0ad8fde391d7a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1748.312339] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1748.338844] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1748.339484] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1748.339484] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1748.339608] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1748.339652] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1748.339807] env[69648]: DEBUG nova.virt.hardware [None 
req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1748.340045] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1748.340202] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1748.340376] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1748.340546] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1748.340725] env[69648]: DEBUG nova.virt.hardware [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1748.341625] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab1a3e9e-3f7c-491b-b8cc-614b2cc1d2f7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.350654] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b453eb2-86df-4345-8c0a-58dd986e245d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.666457] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Successfully created port: 651f889a-25c7-43e0-b918-a0ca4bd671df {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1749.580058] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Successfully updated port: 651f889a-25c7-43e0-b918-a0ca4bd671df {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1749.591526] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 
tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "refresh_cache-3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1749.591644] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquired lock "refresh_cache-3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1749.591800] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1749.628789] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1749.787038] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Updating instance_info_cache with network_info: [{"id": "651f889a-25c7-43e0-b918-a0ca4bd671df", "address": "fa:16:3e:69:91:ce", "network": {"id": "130e81f4-2301-4499-b916-449ad32b9389", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-868563699-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c24c07422cdb4ae193a0ad8fde391d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap651f889a-25", "ovs_interfaceid": "651f889a-25c7-43e0-b918-a0ca4bd671df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1749.798901] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Releasing lock "refresh_cache-3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1749.799322] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Instance network_info: 
|[{"id": "651f889a-25c7-43e0-b918-a0ca4bd671df", "address": "fa:16:3e:69:91:ce", "network": {"id": "130e81f4-2301-4499-b916-449ad32b9389", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-868563699-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c24c07422cdb4ae193a0ad8fde391d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap651f889a-25", "ovs_interfaceid": "651f889a-25c7-43e0-b918-a0ca4bd671df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1749.799742] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:69:91:ce', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f54f7284-8f7d-47ee-839d-2143062cfe44', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '651f889a-25c7-43e0-b918-a0ca4bd671df', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1749.807472] env[69648]: DEBUG oslo.service.loopingcall [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1749.808012] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1749.808313] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ed17a36a-cd57-4f68-afdc-219bed2c07b4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.829310] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1749.829310] env[69648]: value = "task-3466639" [ 1749.829310] env[69648]: _type = "Task" [ 1749.829310] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1749.838305] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466639, 'name': CreateVM_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1749.879842] env[69648]: DEBUG nova.compute.manager [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Received event network-vif-plugged-651f889a-25c7-43e0-b918-a0ca4bd671df {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1749.880291] env[69648]: DEBUG oslo_concurrency.lockutils [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] Acquiring lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1749.880650] env[69648]: DEBUG oslo_concurrency.lockutils [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] Lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1749.880876] env[69648]: DEBUG oslo_concurrency.lockutils [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] Lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1749.881171] env[69648]: DEBUG nova.compute.manager [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] No waiting events found dispatching network-vif-plugged-651f889a-25c7-43e0-b918-a0ca4bd671df {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1749.881449] env[69648]: WARNING nova.compute.manager [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Received unexpected event network-vif-plugged-651f889a-25c7-43e0-b918-a0ca4bd671df for instance with vm_state building and task_state spawning. [ 1749.881749] env[69648]: DEBUG nova.compute.manager [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Received event network-changed-651f889a-25c7-43e0-b918-a0ca4bd671df {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1749.882095] env[69648]: DEBUG nova.compute.manager [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Refreshing instance network info cache due to event network-changed-651f889a-25c7-43e0-b918-a0ca4bd671df. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1749.882421] env[69648]: DEBUG oslo_concurrency.lockutils [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] Acquiring lock "refresh_cache-3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1749.882698] env[69648]: DEBUG oslo_concurrency.lockutils [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] Acquired lock "refresh_cache-3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1749.882999] env[69648]: DEBUG nova.network.neutron [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Refreshing network info cache for port 651f889a-25c7-43e0-b918-a0ca4bd671df {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1750.065040] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1750.065224] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances with incomplete migration {{(pid=69648) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}} [ 1750.152045] env[69648]: DEBUG nova.network.neutron [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Updated VIF entry in instance network info cache for port 651f889a-25c7-43e0-b918-a0ca4bd671df. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1750.152417] env[69648]: DEBUG nova.network.neutron [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Updating instance_info_cache with network_info: [{"id": "651f889a-25c7-43e0-b918-a0ca4bd671df", "address": "fa:16:3e:69:91:ce", "network": {"id": "130e81f4-2301-4499-b916-449ad32b9389", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-868563699-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c24c07422cdb4ae193a0ad8fde391d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f54f7284-8f7d-47ee-839d-2143062cfe44", "external-id": "nsx-vlan-transportzone-569", "segmentation_id": 569, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap651f889a-25", "ovs_interfaceid": "651f889a-25c7-43e0-b918-a0ca4bd671df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1750.162692] env[69648]: DEBUG oslo_concurrency.lockutils [req-06c0313c-9339-474b-a16a-85128cc9fe25 req-38440401-57fe-4a61-a62e-7e410f25ff2a service nova] Releasing lock "refresh_cache-3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1750.339720] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466639, 'name': CreateVM_Task, 'duration_secs': 0.30403} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1750.339934] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1750.340739] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1750.340919] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1750.341261] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1750.341525] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a5ecff4e-29da-42df-998e-050a77d43ade {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.346092] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: (returnval){ [ 1750.346092] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52f67940-3f50-8cc0-c9a8-fa59017d9bc2" [ 1750.346092] env[69648]: _type = "Task" [ 1750.346092] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1750.353685] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52f67940-3f50-8cc0-c9a8-fa59017d9bc2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1750.855987] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1750.856293] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1750.856481] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1751.065805] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1778.290568] env[69648]: DEBUG oslo_concurrency.lockutils [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "723972b1-3f91-4c59-b265-3975644dadb2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1783.598748] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1783.599072] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1791.675789] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1791.736574] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "18745ec2-477d-427d-b2dd-997f73d9fd53" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1793.855997] env[69648]: WARNING oslo_vmware.rw_handles [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1793.855997] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1793.855997] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1793.855997] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1793.855997] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1793.855997] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1793.855997] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1793.855997] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1793.855997] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1793.855997] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1793.855997] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1793.855997] env[69648]: ERROR oslo_vmware.rw_handles [ 1793.855997] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/98099ce1-9ccd-43ed-a2b9-c0acbe5c08c7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1793.857952] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1793.858233] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Copying Virtual Disk [datastore1] vmware_temp/98099ce1-9ccd-43ed-a2b9-c0acbe5c08c7/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/98099ce1-9ccd-43ed-a2b9-c0acbe5c08c7/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1793.858593] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b417a1fa-5663-402f-9bd5-ad0abb539ae5 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.866902] env[69648]: DEBUG oslo_vmware.api [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: (returnval){ [ 1793.866902] env[69648]: value = "task-3466640" [ 1793.866902] env[69648]: _type = "Task" [ 1793.866902] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1793.874829] env[69648]: DEBUG oslo_vmware.api [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': task-3466640, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1794.377504] env[69648]: DEBUG oslo_vmware.exceptions [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1794.377875] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1794.378375] env[69648]: ERROR nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1794.378375] env[69648]: Faults: ['InvalidArgument'] [ 1794.378375] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Traceback (most recent call last): [ 1794.378375] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1794.378375] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] yield resources [ 1794.378375] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1794.378375] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] self.driver.spawn(context, instance, image_meta, [ 1794.378375] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1794.378375] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1794.378375] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1794.378375] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] 
self._fetch_image_if_missing(context, vi) [ 1794.378375] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] image_cache(vi, tmp_image_ds_loc) [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] vm_util.copy_virtual_disk( [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] session._wait_for_task(vmdk_copy_task) [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] return self.wait_for_task(task_ref) [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] return evt.wait() [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] result = hub.switch() [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1794.378735] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] return self.greenlet.switch() [ 1794.379449] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1794.379449] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] self.f(*self.args, **self.kw) [ 1794.379449] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1794.379449] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] raise exceptions.translate_fault(task_info.error) [ 1794.379449] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1794.379449] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Faults: ['InvalidArgument'] [ 1794.379449] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] [ 1794.379449] env[69648]: INFO nova.compute.manager [None 
req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Terminating instance [ 1794.380283] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1794.380495] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1794.381127] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1794.381320] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1794.381549] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f7225478-8172-4b1a-bc01-af40b4e88e8e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.384042] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6654ffbd-69b2-469c-b049-175e8a589ac9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.390784] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1794.391040] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-020ce75a-fa86-4875-b8db-e6e2cc320519 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.393297] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1794.393476] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1794.394437] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-81115429-655b-431a-8c01-73d0d2df382f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.399205] env[69648]: DEBUG oslo_vmware.api [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Waiting for the task: (returnval){ [ 1794.399205] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52352915-4bbb-2aa5-9cb6-6b6c7fe29fd7" [ 1794.399205] env[69648]: _type = "Task" [ 1794.399205] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1794.406388] env[69648]: DEBUG oslo_vmware.api [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52352915-4bbb-2aa5-9cb6-6b6c7fe29fd7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1794.410965] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1794.411204] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1794.457643] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1794.457850] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1794.457993] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Deleting the datastore file [datastore1] 58804be5-ee46-4b25-be84-890d5cd1607f {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1794.458332] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6da33a9c-14db-4f69-85d4-56954cbb3370 {{(pid=69648) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.464109] env[69648]: DEBUG oslo_vmware.api [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: (returnval){ [ 1794.464109] env[69648]: value = "task-3466642" [ 1794.464109] env[69648]: _type = "Task" [ 1794.464109] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1794.472054] env[69648]: DEBUG oslo_vmware.api [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': task-3466642, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1794.909585] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1794.909930] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Creating directory with path [datastore1] vmware_temp/2ea6ec61-42a1-41e0-b23a-4ccab2fad202/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1794.910161] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5db35162-04fa-4151-929d-e45bcf328fb0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.921398] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Created directory with path [datastore1] vmware_temp/2ea6ec61-42a1-41e0-b23a-4ccab2fad202/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1794.921594] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Fetch image to [datastore1] vmware_temp/2ea6ec61-42a1-41e0-b23a-4ccab2fad202/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1794.921765] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/2ea6ec61-42a1-41e0-b23a-4ccab2fad202/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1794.922484] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01385996-22fa-4221-8578-72971a5513f1 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.930383] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ddc82c9-8e1d-4c84-9f66-869b23d93497 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.939296] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10f56768-e9d7-48d0-853d-b35de07cb519 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.969360] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_power_states {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1794.974976] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36995d99-10ba-400e-9f12-fe6e2f899407 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.983794] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6ccbdf5f-7c17-409d-9613-793f310b6e82 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.985504] env[69648]: DEBUG oslo_vmware.api [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': task-3466642, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076835} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1794.985714] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1794.985890] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1794.986075] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1794.986277] env[69648]: INFO nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1794.990775] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Getting list of instances from cluster (obj){ [ 1794.990775] env[69648]: value = "domain-c8" [ 1794.990775] env[69648]: _type = "ClusterComputeResource" [ 1794.990775] env[69648]: } {{(pid=69648) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1794.991282] env[69648]: DEBUG nova.compute.claims [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1794.991453] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1794.991741] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1794.995049] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf642856-7cab-4ca6-b760-1fd19db56a9a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.010983] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Got total of 9 instances {{(pid=69648) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1795.011189] env[69648]: WARNING nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] While synchronizing instance power states, found 10 instances in the database and 9 instances on the hypervisor. 
[ 1795.011341] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 58804be5-ee46-4b25-be84-890d5cd1607f {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1795.011556] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid c97308be-406b-4fd0-b502-69e8c800773f {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1795.011719] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 590dbeb2-7e21-454f-93b5-97065c5bfdb0 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1795.011875] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 3dc3db1c-43c0-45e9-8283-38e77f66f06f {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1795.012045] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 114bdafc-21f6-4a77-bf19-a444cbd8806c {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1795.012210] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1795.012365] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid bde8a72e-0ed5-4794-badf-0bc54c4c408b {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1795.012520] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 723972b1-3f91-4c59-b265-3975644dadb2 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1795.012743] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 18745ec2-477d-427d-b2dd-997f73d9fd53 {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1795.012949] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Triggering sync for uuid 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd {{(pid=69648) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1795.014639] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "58804be5-ee46-4b25-be84-890d5cd1607f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.014995] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "c97308be-406b-4fd0-b502-69e8c800773f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.015174] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.015339] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.015522] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "114bdafc-21f6-4a77-bf19-a444cbd8806c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.015732] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.015929] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.016143] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "723972b1-3f91-4c59-b265-3975644dadb2" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.016367] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "18745ec2-477d-427d-b2dd-997f73d9fd53" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.016581] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.017108] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1795.073329] env[69648]: DEBUG oslo_vmware.rw_handles [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2ea6ec61-42a1-41e0-b23a-4ccab2fad202/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1795.134085] env[69648]: DEBUG oslo_vmware.rw_handles [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1795.134324] env[69648]: DEBUG oslo_vmware.rw_handles [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2ea6ec61-42a1-41e0-b23a-4ccab2fad202/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1795.257194] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af01b6fc-6ecc-4529-bb3f-2c7704973a4d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.264786] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4718d73a-b9e9-49e6-944c-a4973f2d0b13 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.294553] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29528626-a374-4262-9530-bb73ec0fcbeb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.301844] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6232fcb6-69be-49bd-a249-5f4bc6271bd2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.316148] env[69648]: DEBUG nova.compute.provider_tree [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1795.324058] env[69648]: DEBUG nova.scheduler.client.report [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1795.339454] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 
tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.348s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1795.340023] env[69648]: ERROR nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1795.340023] env[69648]: Faults: ['InvalidArgument'] [ 1795.340023] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Traceback (most recent call last): [ 1795.340023] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1795.340023] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] self.driver.spawn(context, instance, image_meta, [ 1795.340023] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1795.340023] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1795.340023] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1795.340023] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] self._fetch_image_if_missing(context, vi) [ 1795.340023] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1795.340023] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] image_cache(vi, tmp_image_ds_loc) [ 1795.340023] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] vm_util.copy_virtual_disk( [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] session._wait_for_task(vmdk_copy_task) [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] return self.wait_for_task(task_ref) [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] return evt.wait() [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 
58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] result = hub.switch() [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] return self.greenlet.switch() [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1795.340591] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] self.f(*self.args, **self.kw) [ 1795.341076] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1795.341076] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] raise exceptions.translate_fault(task_info.error) [ 1795.341076] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1795.341076] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Faults: ['InvalidArgument'] [ 1795.341076] env[69648]: ERROR nova.compute.manager [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] [ 1795.341076] env[69648]: DEBUG nova.compute.utils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1795.342234] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Build of instance 58804be5-ee46-4b25-be84-890d5cd1607f was re-scheduled: A specified parameter was not correct: fileType [ 1795.342234] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1795.342621] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1795.343064] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1795.343064] env[69648]: DEBUG nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1795.343265] env[69648]: DEBUG nova.network.neutron [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1795.620449] env[69648]: DEBUG nova.network.neutron [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1795.630361] env[69648]: INFO nova.compute.manager [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Took 0.29 seconds to deallocate network for instance. [ 1795.722456] env[69648]: INFO nova.scheduler.client.report [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Deleted allocations for instance 58804be5-ee46-4b25-be84-890d5cd1607f [ 1795.743025] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c5affff7-7faf-40ae-8f73-087eb3e9c940 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "58804be5-ee46-4b25-be84-890d5cd1607f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 593.893s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1795.743799] env[69648]: DEBUG oslo_concurrency.lockutils [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "58804be5-ee46-4b25-be84-890d5cd1607f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 397.376s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1795.744127] env[69648]: DEBUG oslo_concurrency.lockutils [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "58804be5-ee46-4b25-be84-890d5cd1607f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.744365] env[69648]: DEBUG oslo_concurrency.lockutils [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "58804be5-ee46-4b25-be84-890d5cd1607f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1795.744541] env[69648]: DEBUG oslo_concurrency.lockutils [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "58804be5-ee46-4b25-be84-890d5cd1607f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1795.746817] env[69648]: INFO nova.compute.manager [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Terminating instance [ 1795.748637] env[69648]: DEBUG nova.compute.manager [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1795.749057] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1795.749602] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-308dc510-e56d-4638-a7a2-49af4076d31b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.758481] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e67aa955-bc8b-4b1c-92bf-294bba94d266 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.771892] env[69648]: DEBUG nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1795.795397] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 58804be5-ee46-4b25-be84-890d5cd1607f could not be found. [ 1795.795833] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1795.795833] env[69648]: INFO nova.compute.manager [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1795.796125] env[69648]: DEBUG oslo.service.loopingcall [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1795.796406] env[69648]: DEBUG nova.compute.manager [-] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1795.796507] env[69648]: DEBUG nova.network.neutron [-] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1795.824882] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.825273] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1795.826666] env[69648]: INFO nova.compute.claims [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1795.837504] env[69648]: DEBUG nova.network.neutron [-] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1795.854161] env[69648]: INFO nova.compute.manager [-] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] Took 0.06 seconds to deallocate network for instance. [ 1795.941542] env[69648]: DEBUG oslo_concurrency.lockutils [None req-2c0ed535-a432-4e3a-a854-a848df2eab41 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "58804be5-ee46-4b25-be84-890d5cd1607f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.198s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1795.942787] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "58804be5-ee46-4b25-be84-890d5cd1607f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.928s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1795.942787] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 58804be5-ee46-4b25-be84-890d5cd1607f] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1795.942787] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "58804be5-ee46-4b25-be84-890d5cd1607f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1796.033835] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e21f9845-27b3-4933-aec9-145d7a2a7b63 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1796.041786] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce3e7963-d510-4d36-93e4-386bfbe78e8c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1796.072758] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c8b270f-30a4-428b-a2b1-a1030c712da4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1796.079938] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7966556-260d-4efc-8d3f-f9f6c76fe34b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1796.093204] env[69648]: DEBUG nova.compute.provider_tree [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1796.102084] env[69648]: DEBUG nova.scheduler.client.report [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1796.116539] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1796.117014] env[69648]: DEBUG nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Start building networks asynchronously for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1796.150997] env[69648]: DEBUG nova.compute.utils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1796.152395] env[69648]: DEBUG nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Not allocating networking since 'none' was specified. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1796.160679] env[69648]: DEBUG nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1796.224661] env[69648]: DEBUG nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1796.249773] env[69648]: DEBUG nova.virt.hardware [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1796.250057] env[69648]: DEBUG nova.virt.hardware [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1796.250224] env[69648]: DEBUG nova.virt.hardware [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1796.250407] env[69648]: DEBUG nova.virt.hardware [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1796.250555] env[69648]: DEBUG nova.virt.hardware [None 
req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1796.250704] env[69648]: DEBUG nova.virt.hardware [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1796.250910] env[69648]: DEBUG nova.virt.hardware [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1796.251101] env[69648]: DEBUG nova.virt.hardware [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1796.251308] env[69648]: DEBUG nova.virt.hardware [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1796.251479] env[69648]: DEBUG nova.virt.hardware [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1796.251656] env[69648]: DEBUG nova.virt.hardware [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1796.252532] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bb41efa-99cb-48a7-b266-b210f7974d9c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1796.260896] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c15b95d-1053-4b00-b917-fd973b3e0ef1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1796.274631] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Instance VIF info [] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1796.280398] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Creating folder: Project (62e197146c8d4e3991417a5c2b3f51c2). Parent ref: group-v692308. 
{{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1796.280693] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b7e1cf29-2618-46bf-a160-546bebd1a900 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1796.290726] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Created folder: Project (62e197146c8d4e3991417a5c2b3f51c2) in parent group-v692308. [ 1796.290911] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Creating folder: Instances. Parent ref: group-v692407. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1796.291143] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b89406bb-2be7-4147-bf06-9cdd7c5b80e2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1796.301062] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Created folder: Instances in parent group-v692407. [ 1796.301304] env[69648]: DEBUG oslo.service.loopingcall [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1796.301495] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1796.301908] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8772d74e-ac6c-4941-853d-afcbdb080b3e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1796.317777] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1796.317777] env[69648]: value = "task-3466645" [ 1796.317777] env[69648]: _type = "Task" [ 1796.317777] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1796.325039] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466645, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1796.828482] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466645, 'name': CreateVM_Task, 'duration_secs': 0.246794} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1796.828756] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1796.829201] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1796.829393] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1796.829714] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1796.829956] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b5fe112b-7dd3-4e16-a6b1-32ad4c6be7e1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1796.834155] env[69648]: DEBUG oslo_vmware.api [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Waiting for the task: (returnval){ [ 1796.834155] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5288929c-d92a-ac7e-20fc-3ca07cb65f18" [ 1796.834155] env[69648]: _type = "Task" [ 1796.834155] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1796.841148] env[69648]: DEBUG oslo_vmware.api [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5288929c-d92a-ac7e-20fc-3ca07cb65f18, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1797.344163] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1797.344474] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1797.345270] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1798.112582] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1798.112836] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1799.065661] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1799.908515] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquiring lock "0a8de1d1-a783-4a32-9ee0-abb023943eeb" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1801.064782] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1803.060694] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1806.065698] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 
1806.065941] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1806.065980] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1806.077292] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1806.077512] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1806.077681] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1806.077839] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1806.079364] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a339be57-01fa-401c-a01c-694eaf53c108 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.088048] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe47f44e-ae14-414a-902f-bef67c55a907 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.102370] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e009a9cd-a408-40ca-96d2-eba414c837b2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.108807] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13fe8baa-412f-4e66-bb3e-98d1f47d36fb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.137881] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180921MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1806.138052] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1806.138279] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1806.211183] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance c97308be-406b-4fd0-b502-69e8c800773f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1806.211359] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1806.211490] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1806.211616] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 114bdafc-21f6-4a77-bf19-a444cbd8806c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1806.211740] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1806.211860] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1806.211978] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 723972b1-3f91-4c59-b265-3975644dadb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1806.212166] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 18745ec2-477d-427d-b2dd-997f73d9fd53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1806.212309] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1806.212430] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1806.223845] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3edf3b50-a4bf-4e75-927a-db78c433dbc4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1806.235370] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 31c84a7e-7a41-4d9f-ad29-6dad6648d85f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1806.244607] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1806.253520] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1806.253737] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1806.253884] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1806.411114] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dadcebeb-e8e3-43e3-aefc-3906240d6def {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.418467] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6a308bd-26d0-4b9c-a211-5c18c0907502 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.447775] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56081603-c699-4fc9-85b5-f1324dba38ae {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.454524] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f98fe492-8939-4f88-9338-39895a69e58f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.466913] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1806.475071] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1806.491304] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1806.491512] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.353s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1808.491413] env[69648]: DEBUG oslo_service.periodic_task [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1810.066036] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1810.066036] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1810.066482] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1810.086857] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1810.087031] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1810.087144] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1810.087274] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1810.087401] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1810.087533] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1810.087692] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1810.087826] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1810.087950] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1810.088082] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1810.088206] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1817.812858] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "cb6b7f04-1c44-4998-bd28-8a01c4b235e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1817.812858] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "cb6b7f04-1c44-4998-bd28-8a01c4b235e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1833.706335] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquiring lock "cc77a95f-ea00-4b01-96ac-8256672eeb39" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1833.706643] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "cc77a95f-ea00-4b01-96ac-8256672eeb39" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1844.715493] env[69648]: WARNING oslo_vmware.rw_handles [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1844.715493] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1844.715493] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1844.715493] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1844.715493] env[69648]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1844.715493] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1844.715493] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1844.715493] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1844.715493] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1844.715493] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1844.715493] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1844.715493] env[69648]: ERROR oslo_vmware.rw_handles [ 1844.716223] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/2ea6ec61-42a1-41e0-b23a-4ccab2fad202/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1844.717984] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1844.718260] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Copying Virtual Disk [datastore1] vmware_temp/2ea6ec61-42a1-41e0-b23a-4ccab2fad202/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/2ea6ec61-42a1-41e0-b23a-4ccab2fad202/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1844.718547] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1e35b3e5-f824-4114-be7c-763fdb9607ea {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.726063] env[69648]: DEBUG oslo_vmware.api [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Waiting for the task: (returnval){ [ 1844.726063] env[69648]: value = "task-3466646" [ 1844.726063] env[69648]: _type = "Task" [ 1844.726063] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1844.733654] env[69648]: DEBUG oslo_vmware.api [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Task: {'id': task-3466646, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1845.236641] env[69648]: DEBUG oslo_vmware.exceptions [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1845.236975] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1845.237562] env[69648]: ERROR nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1845.237562] env[69648]: Faults: ['InvalidArgument'] [ 1845.237562] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] Traceback (most recent call last): [ 1845.237562] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1845.237562] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] yield resources [ 1845.237562] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1845.237562] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] self.driver.spawn(context, instance, image_meta, [ 1845.237562] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1845.237562] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1845.237562] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1845.237562] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] self._fetch_image_if_missing(context, vi) [ 1845.237562] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1845.237562] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] image_cache(vi, tmp_image_ds_loc) [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] vm_util.copy_virtual_disk( [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] session._wait_for_task(vmdk_copy_task) [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task 
[ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] return self.wait_for_task(task_ref) [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] return evt.wait() [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] result = hub.switch() [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] return self.greenlet.switch() [ 1845.237952] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1845.238517] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] self.f(*self.args, **self.kw) [ 1845.238517] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1845.238517] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] raise exceptions.translate_fault(task_info.error) [ 1845.238517] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1845.238517] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] Faults: ['InvalidArgument'] [ 1845.238517] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] [ 1845.238517] env[69648]: INFO nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Terminating instance [ 1845.240078] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1845.240078] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1845.240078] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e46cb239-8e80-4218-ad05-d4f83ac72d25 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.242438] env[69648]: DEBUG nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1845.242629] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1845.243380] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e188adfc-634c-45c7-b108-4b5d1b64125b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.251041] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1845.251041] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1991dd6e-a7d4-4c29-9edd-cd4ea33304a5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.252240] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1845.252413] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1845.253411] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-21313e05-ed84-43b6-837e-4e628bcae249 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.258607] env[69648]: DEBUG oslo_vmware.api [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for the task: (returnval){ [ 1845.258607] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52f1254f-0df8-913f-63cb-06ec1fbb52b8" [ 1845.258607] env[69648]: _type = "Task" [ 1845.258607] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1845.265645] env[69648]: DEBUG oslo_vmware.api [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52f1254f-0df8-913f-63cb-06ec1fbb52b8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1845.309137] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1845.309373] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1845.309555] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Deleting the datastore file [datastore1] c97308be-406b-4fd0-b502-69e8c800773f {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1845.309851] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7ae7c230-a196-4c7f-9f7e-263d0aafc9c9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.316571] env[69648]: DEBUG oslo_vmware.api [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Waiting for the task: (returnval){ [ 1845.316571] env[69648]: value = "task-3466648" [ 1845.316571] env[69648]: _type = "Task" [ 1845.316571] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1845.324253] env[69648]: DEBUG oslo_vmware.api [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Task: {'id': task-3466648, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1845.769092] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1845.769459] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Creating directory with path [datastore1] vmware_temp/84324e3a-2109-418e-8741-c4d48c69cc10/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1845.769580] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-15011aec-b418-4885-ae1e-66651e928d62 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.780731] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Created directory with path [datastore1] vmware_temp/84324e3a-2109-418e-8741-c4d48c69cc10/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1845.780900] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Fetch image to [datastore1] vmware_temp/84324e3a-2109-418e-8741-c4d48c69cc10/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1845.781111] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/84324e3a-2109-418e-8741-c4d48c69cc10/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1845.781798] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fc4c77d-7400-4ffc-b191-707e06d12236 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.787939] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-730eecc4-5860-4e8c-918a-cd4a3d14303f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.796576] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-490b5c00-cf57-4752-85b3-88d9332c71e1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.830151] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-bb86bbea-bcda-4942-ab6c-b3e126db34e8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.836679] env[69648]: DEBUG oslo_vmware.api [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Task: {'id': task-3466648, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065377} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1845.838083] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1845.838322] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1845.838577] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1845.838844] env[69648]: INFO nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Took 0.60 seconds to destroy the instance on the hypervisor. 
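Annotation (not part of the captured log): the failure path above is oslo.vmware's task polling surfacing a vCenter fault. _poll_task repeatedly reads the task state (the "CopyVirtualDisk_Task ... progress is 0%" lines), and when the task errors the fault is translated and raised as VimFaultException with "A specified parameter was not correct: fileType" / Faults: ['InvalidArgument'], which the spawn path lets propagate before destroying the half-built VM. A hedged sketch of that pattern follows; "session" is assumed to be an existing oslo_vmware.api.VMwareAPISession and "copy_task" stands in for a task reference such as task-3466646 (names are illustrative, not Nova's code):

    from oslo_vmware import exceptions as vexc


    def wait_for_copy(session, copy_task):
        """Block until a CopyVirtualDisk task finishes, surfacing vCenter faults."""
        try:
            # wait_for_task() polls the server-side task state, mirroring the
            # "Task: {'id': task-..., 'name': CopyVirtualDisk_Task} progress is 0%"
            # lines, and raises once the task reports an error.
            return session.wait_for_task(copy_task)
        except vexc.VimFaultException as err:
            # The fileType failure above arrives here with
            # 'InvalidArgument' present in err.fault_list.
            if "InvalidArgument" in err.fault_list:
                raise RuntimeError("CopyVirtualDisk rejected by vCenter: %s" % err)
            raise

In the log, this exception is what drives the subsequent "Terminating instance" / "Aborting claim" sequence for instance c97308be-406b-4fd0-b502-69e8c800773f.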
[ 1845.840783] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3889cfca-ea53-470a-98db-8c8f9b241fbc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.842641] env[69648]: DEBUG nova.compute.claims [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1845.842877] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1845.843146] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1845.867156] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1846.013380] env[69648]: DEBUG oslo_vmware.rw_handles [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/84324e3a-2109-418e-8741-c4d48c69cc10/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1846.075884] env[69648]: DEBUG oslo_vmware.rw_handles [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1846.076102] env[69648]: DEBUG oslo_vmware.rw_handles [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/84324e3a-2109-418e-8741-c4d48c69cc10/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1846.117593] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aeac6d1-b71f-42ba-9592-8ef99ab6a8ec {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.125181] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60065072-ea00-4912-9679-1b695cc85bbf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.154774] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dda96a80-b566-4eab-a80c-97c44020a8dd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.161763] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-677b9d3c-ab54-4930-987b-81b46bd2ad9b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.175452] env[69648]: DEBUG nova.compute.provider_tree [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1846.183676] env[69648]: DEBUG nova.scheduler.client.report [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1846.197893] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.355s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1846.198430] env[69648]: ERROR nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1846.198430] env[69648]: Faults: ['InvalidArgument'] [ 1846.198430] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] Traceback (most recent call last): [ 1846.198430] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1846.198430] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] 
self.driver.spawn(context, instance, image_meta, [ 1846.198430] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1846.198430] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1846.198430] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1846.198430] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] self._fetch_image_if_missing(context, vi) [ 1846.198430] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1846.198430] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] image_cache(vi, tmp_image_ds_loc) [ 1846.198430] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] vm_util.copy_virtual_disk( [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] session._wait_for_task(vmdk_copy_task) [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] return self.wait_for_task(task_ref) [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] return evt.wait() [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] result = hub.switch() [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] return self.greenlet.switch() [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1846.198865] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] self.f(*self.args, **self.kw) [ 1846.199302] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 1846.199302] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] raise exceptions.translate_fault(task_info.error) [ 1846.199302] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1846.199302] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] Faults: ['InvalidArgument'] [ 1846.199302] env[69648]: ERROR nova.compute.manager [instance: c97308be-406b-4fd0-b502-69e8c800773f] [ 1846.199302] env[69648]: DEBUG nova.compute.utils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1846.200914] env[69648]: DEBUG nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Build of instance c97308be-406b-4fd0-b502-69e8c800773f was re-scheduled: A specified parameter was not correct: fileType [ 1846.200914] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1846.201352] env[69648]: DEBUG nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1846.201532] env[69648]: DEBUG nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1846.201702] env[69648]: DEBUG nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1846.201866] env[69648]: DEBUG nova.network.neutron [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1846.649927] env[69648]: DEBUG nova.network.neutron [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1846.661574] env[69648]: INFO nova.compute.manager [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Took 0.46 seconds to deallocate network for instance. 
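The two "Inventory has not changed" entries above come from the resource tracker reporting to placement: the freshly computed inventory for provider d38a352b-7808-44da-8216-792e96aadc88 matches the cached copy, so no update call is made. Below is a minimal sketch of that compare-and-skip decision, using the exact inventory data from the log; inventory_changed() is a simplified hypothetical helper, not Nova's SchedulerReportClient.

# Illustrative sketch only: skip the placement update when the computed
# inventory equals the cached inventory for the provider.

def inventory_changed(cached, computed):
    """Return True if any resource class or any of its fields differs."""
    if set(cached) != set(computed):
        return True
    return any(cached[rc] != computed[rc] for rc in computed)


cached = {
    'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
             'step_size': 1, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                  'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94,
                'step_size': 1, 'allocation_ratio': 1.0},
}

if not inventory_changed(cached, dict(cached)):
    print('Inventory has not changed for provider '
          'd38a352b-7808-44da-8216-792e96aadc88; skipping update to placement.')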
[ 1846.775354] env[69648]: INFO nova.scheduler.client.report [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Deleted allocations for instance c97308be-406b-4fd0-b502-69e8c800773f [ 1846.794056] env[69648]: DEBUG oslo_concurrency.lockutils [None req-728117e5-57ed-484c-b5a1-862ac85d6f6c tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Lock "c97308be-406b-4fd0-b502-69e8c800773f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 637.668s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1846.795112] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Lock "c97308be-406b-4fd0-b502-69e8c800773f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 441.748s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.795333] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Acquiring lock "c97308be-406b-4fd0-b502-69e8c800773f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1846.795531] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Lock "c97308be-406b-4fd0-b502-69e8c800773f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.795691] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Lock "c97308be-406b-4fd0-b502-69e8c800773f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1846.797693] env[69648]: INFO nova.compute.manager [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Terminating instance [ 1846.799461] env[69648]: DEBUG nova.compute.manager [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1846.799670] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1846.800169] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-39c9b965-f194-46d3-a372-de7d3b882b6b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.807989] env[69648]: DEBUG nova.compute.manager [None req-19214503-4e9f-4079-8707-b20bcb05e730 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: 3edf3b50-a4bf-4e75-927a-db78c433dbc4] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1846.813984] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9f0a505-c67c-431d-aa02-24a9ecf5fa7b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.831753] env[69648]: DEBUG nova.compute.manager [None req-19214503-4e9f-4079-8707-b20bcb05e730 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: 3edf3b50-a4bf-4e75-927a-db78c433dbc4] Instance disappeared before build. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1846.842777] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c97308be-406b-4fd0-b502-69e8c800773f could not be found. [ 1846.842975] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1846.843190] env[69648]: INFO nova.compute.manager [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1846.843433] env[69648]: DEBUG oslo.service.loopingcall [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1846.843652] env[69648]: DEBUG nova.compute.manager [-] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1846.843750] env[69648]: DEBUG nova.network.neutron [-] [instance: c97308be-406b-4fd0-b502-69e8c800773f] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1846.855620] env[69648]: DEBUG oslo_concurrency.lockutils [None req-19214503-4e9f-4079-8707-b20bcb05e730 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "3edf3b50-a4bf-4e75-927a-db78c433dbc4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.846s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1846.864981] env[69648]: DEBUG nova.compute.manager [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1846.867674] env[69648]: DEBUG nova.network.neutron [-] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1846.877586] env[69648]: INFO nova.compute.manager [-] [instance: c97308be-406b-4fd0-b502-69e8c800773f] Took 0.03 seconds to deallocate network for instance. 
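The WARNING / "Instance destroyed" pair above shows the tolerant destroy path during terminate_instance: the backend lookup raises InstanceNotFound, the driver treats the instance as already gone, and network cleanup still runs so repeated delete requests stay idempotent. A rough, self-contained sketch of that control flow; the find/destroy/deallocate callables are hypothetical stand-ins for the driver and network internals, not Nova's real signatures.

# Illustrative sketch only: destroy tolerates a VM that no longer exists on
# the backend and always proceeds to network deallocation.
import time


class InstanceNotFound(Exception):
    pass


def destroy_instance(instance_uuid, find_vm, destroy_vm, deallocate_network):
    """Destroy the VM if it exists, then always clean up its network."""
    started = time.monotonic()
    try:
        vm_ref = find_vm(instance_uuid)      # e.g. a SearchIndex.FindAllByUuid lookup
        destroy_vm(vm_ref)
    except InstanceNotFound:
        # Nothing on the backend: treat it as already destroyed and continue.
        pass
    print('Took %.2f seconds to destroy the instance on the hypervisor.'
          % (time.monotonic() - started))
    deallocate_network(instance_uuid)        # runs even when the VM was missing


# Demo with a lookup that reports the instance as missing.
def _missing(uuid):
    raise InstanceNotFound(uuid)

destroy_instance('c97308be-406b-4fd0-b502-69e8c800773f', _missing,
                 destroy_vm=lambda ref: None,
                 deallocate_network=lambda uuid: None)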
[ 1846.923376] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1846.923626] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.925179] env[69648]: INFO nova.compute.claims [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1846.989799] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f43598ec-74e9-48e8-8570-2c94fa9d94c2 tempest-ServersTestJSON-47170274 tempest-ServersTestJSON-47170274-project-member] Lock "c97308be-406b-4fd0-b502-69e8c800773f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.195s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1846.990657] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "c97308be-406b-4fd0-b502-69e8c800773f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 51.976s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.990854] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: c97308be-406b-4fd0-b502-69e8c800773f] During sync_power_state the instance has a pending task (deleting). Skip. 
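The lockutils lines around the "compute_resources" lock report two durations: how long the caller waited to acquire the lock and how long it held it. The sketch below re-creates just that timing bookkeeping with threading.Lock and time.monotonic; it illustrates what those numbers measure and is not oslo.concurrency's implementation.

# Illustrative sketch only: named locks with "waited"/"held" timing output.
import contextlib
import threading
import time

_locks = {}
_registry_guard = threading.Lock()


@contextlib.contextmanager
def timed_lock(name, owner):
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()                                   # blocking time is "waited"
    acquired_at = time.monotonic()
    print('Lock "%s" acquired by "%s" :: waited %.3fs'
          % (name, owner, acquired_at - t0))
    try:
        yield
    finally:
        lock.release()
        print('Lock "%s" "released" by "%s" :: held %.3fs'
              % (name, owner, time.monotonic() - acquired_at))


# Example mirroring the resource-tracker claim above.
with timed_lock('compute_resources',
                'nova.compute.resource_tracker.ResourceTracker.instance_claim'):
    time.sleep(0.01)                                 # the claim work happens here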
[ 1846.991039] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "c97308be-406b-4fd0-b502-69e8c800773f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1847.149575] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd066fb1-13bb-451c-aca2-23f91fc073c2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.157458] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-983264b2-07ee-456e-a198-f154566b34f2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.186383] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1aab9ca6-25f1-40d1-808a-670e2c7a1294 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.193456] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-178e7262-4218-4472-9b78-a596e8e28609 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.207036] env[69648]: DEBUG nova.compute.provider_tree [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1847.216329] env[69648]: DEBUG nova.scheduler.client.report [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1847.231036] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.307s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1847.231036] env[69648]: DEBUG nova.compute.manager [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Start building networks asynchronously for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1847.268125] env[69648]: DEBUG nova.compute.utils [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1847.269495] env[69648]: DEBUG nova.compute.manager [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1847.269678] env[69648]: DEBUG nova.network.neutron [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1847.278706] env[69648]: DEBUG nova.compute.manager [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1847.309612] env[69648]: INFO nova.virt.block_device [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Booting with volume 704cedaf-839e-4387-95a7-76600b742c77 at /dev/sda [ 1847.333645] env[69648]: DEBUG nova.policy [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7ff2342cf44f4329aa84ed5fb211e683', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c54987c4289e424e80abbc7dfcdb5547', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1847.357281] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c082324a-860a-4504-93dd-3e1710491c91 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.365959] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68aef4ef-f6f2-42b3-aa15-576fa8472311 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.400453] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-dba70625-e9fa-4f81-858a-872fd24c227c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.407922] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf52e5d6-9fa2-47cc-86d8-9abbfcef5ead 
{{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.436399] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-898d8508-f4c2-4373-9560-28194c0c95a0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.442563] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0faebcf-2d0b-4eab-bcf0-9c720e5e90f4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.455561] env[69648]: DEBUG nova.virt.block_device [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Updating existing volume attachment record: 384da1e1-f6e5-499c-8df3-3a9d7c811a0e {{(pid=69648) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 1847.672170] env[69648]: DEBUG nova.compute.manager [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1847.672170] env[69648]: DEBUG nova.virt.hardware [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1847.672170] env[69648]: DEBUG nova.virt.hardware [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1847.672323] env[69648]: DEBUG nova.virt.hardware [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1847.672323] env[69648]: DEBUG nova.virt.hardware [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1847.672323] env[69648]: DEBUG nova.virt.hardware [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1847.672323] env[69648]: DEBUG nova.virt.hardware [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1847.676050] env[69648]: DEBUG nova.virt.hardware [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1847.676050] env[69648]: DEBUG nova.virt.hardware [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1847.676050] env[69648]: DEBUG nova.virt.hardware [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1847.676338] env[69648]: DEBUG nova.virt.hardware [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1847.676575] env[69648]: DEBUG nova.virt.hardware [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1847.678253] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33923363-c6a5-450c-8537-33a4c6f4d078 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.689855] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77a6a810-20d2-4764-b45c-3a007471db22 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.696111] env[69648]: DEBUG nova.network.neutron [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Successfully created port: 7c126308-8c1c-4227-8b18-c51ab893b6fd {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1848.335281] env[69648]: DEBUG nova.network.neutron [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Successfully updated port: 7c126308-8c1c-4227-8b18-c51ab893b6fd {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 
1848.345243] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Acquiring lock "refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1848.345472] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Acquired lock "refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1848.345680] env[69648]: DEBUG nova.network.neutron [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1848.387236] env[69648]: DEBUG nova.network.neutron [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1848.545447] env[69648]: DEBUG nova.network.neutron [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Updating instance_info_cache with network_info: [{"id": "7c126308-8c1c-4227-8b18-c51ab893b6fd", "address": "fa:16:3e:74:e1:49", "network": {"id": "da58d28c-4c27-429d-9fb8-fed88a7172d2", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1196888857-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c54987c4289e424e80abbc7dfcdb5547", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec3f9e71-839a-429d-b211-d3dfc98ca4f6", "external-id": "nsx-vlan-transportzone-5", "segmentation_id": 5, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c126308-8c", "ovs_interfaceid": "7c126308-8c1c-4227-8b18-c51ab893b6fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1848.558982] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Releasing lock "refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1848.559327] env[69648]: DEBUG nova.compute.manager [None 
req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Instance network_info: |[{"id": "7c126308-8c1c-4227-8b18-c51ab893b6fd", "address": "fa:16:3e:74:e1:49", "network": {"id": "da58d28c-4c27-429d-9fb8-fed88a7172d2", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1196888857-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c54987c4289e424e80abbc7dfcdb5547", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec3f9e71-839a-429d-b211-d3dfc98ca4f6", "external-id": "nsx-vlan-transportzone-5", "segmentation_id": 5, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c126308-8c", "ovs_interfaceid": "7c126308-8c1c-4227-8b18-c51ab893b6fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1848.559741] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:74:e1:49', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ec3f9e71-839a-429d-b211-d3dfc98ca4f6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7c126308-8c1c-4227-8b18-c51ab893b6fd', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1848.567089] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Creating folder: Project (c54987c4289e424e80abbc7dfcdb5547). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1848.567592] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3eb31a39-0afa-4318-83ee-4276755c7d78 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.581282] env[69648]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 1848.581458] env[69648]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=69648) _invoke_api /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:337}} [ 1848.581753] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Folder already exists: Project (c54987c4289e424e80abbc7dfcdb5547). Parent ref: group-v692308. 
{{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 1848.581943] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Creating folder: Instances. Parent ref: group-v692401. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1848.582177] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6ca109ae-39e2-4fb7-9770-a03a522de5f9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.590728] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Created folder: Instances in parent group-v692401. [ 1848.590968] env[69648]: DEBUG oslo.service.loopingcall [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1848.591164] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1848.591353] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5f53e242-c59e-469f-bb26-c1d92612b284 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.609740] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1848.609740] env[69648]: value = "task-3466651" [ 1848.609740] env[69648]: _type = "Task" [ 1848.609740] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1848.616830] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466651, 'name': CreateVM_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1848.687173] env[69648]: DEBUG nova.compute.manager [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Received event network-vif-plugged-7c126308-8c1c-4227-8b18-c51ab893b6fd {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1848.687173] env[69648]: DEBUG oslo_concurrency.lockutils [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] Acquiring lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1848.687173] env[69648]: DEBUG oslo_concurrency.lockutils [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] Lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1848.687411] env[69648]: DEBUG oslo_concurrency.lockutils [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] Lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1848.687509] env[69648]: DEBUG nova.compute.manager [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] No waiting events found dispatching network-vif-plugged-7c126308-8c1c-4227-8b18-c51ab893b6fd {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1848.687675] env[69648]: WARNING nova.compute.manager [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Received unexpected event network-vif-plugged-7c126308-8c1c-4227-8b18-c51ab893b6fd for instance with vm_state building and task_state spawning. [ 1848.687887] env[69648]: DEBUG nova.compute.manager [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Received event network-changed-7c126308-8c1c-4227-8b18-c51ab893b6fd {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1848.688175] env[69648]: DEBUG nova.compute.manager [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Refreshing instance network info cache due to event network-changed-7c126308-8c1c-4227-8b18-c51ab893b6fd. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1848.688470] env[69648]: DEBUG oslo_concurrency.lockutils [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] Acquiring lock "refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1848.688790] env[69648]: DEBUG oslo_concurrency.lockutils [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] Acquired lock "refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1848.689175] env[69648]: DEBUG nova.network.neutron [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Refreshing network info cache for port 7c126308-8c1c-4227-8b18-c51ab893b6fd {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1848.939916] env[69648]: DEBUG nova.network.neutron [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Updated VIF entry in instance network info cache for port 7c126308-8c1c-4227-8b18-c51ab893b6fd. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1848.940310] env[69648]: DEBUG nova.network.neutron [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Updating instance_info_cache with network_info: [{"id": "7c126308-8c1c-4227-8b18-c51ab893b6fd", "address": "fa:16:3e:74:e1:49", "network": {"id": "da58d28c-4c27-429d-9fb8-fed88a7172d2", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1196888857-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c54987c4289e424e80abbc7dfcdb5547", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec3f9e71-839a-429d-b211-d3dfc98ca4f6", "external-id": "nsx-vlan-transportzone-5", "segmentation_id": 5, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c126308-8c", "ovs_interfaceid": "7c126308-8c1c-4227-8b18-c51ab893b6fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1848.952050] env[69648]: DEBUG oslo_concurrency.lockutils [req-193b59b2-9c16-4251-a5cc-d0599c500772 req-03d7d505-7f6d-4d85-af84-ae4d329b7590 service nova] Releasing lock "refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1849.120097] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466651, 'name': CreateVM_Task, 'duration_secs': 0.310996} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1849.120283] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1849.127324] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'delete_on_termination': True, 'attachment_id': '384da1e1-f6e5-499c-8df3-3a9d7c811a0e', 'disk_bus': None, 'guest_format': None, 'device_type': None, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692404', 'volume_id': '704cedaf-839e-4387-95a7-76600b742c77', 'name': 'volume-704cedaf-839e-4387-95a7-76600b742c77', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '31c84a7e-7a41-4d9f-ad29-6dad6648d85f', 'attached_at': '', 'detached_at': '', 'volume_id': '704cedaf-839e-4387-95a7-76600b742c77', 'serial': '704cedaf-839e-4387-95a7-76600b742c77'}, 'mount_device': '/dev/sda', 'volume_type': None}], 'swap': None} {{(pid=69648) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 1849.127552] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Root volume attach. Driver type: vmdk {{(pid=69648) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 1849.128317] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f853761-31e5-4749-96bd-d568a9afb25c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1849.136243] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc98c2e6-f232-460c-9b67-8e96a41ffe82 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1849.141843] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19af8f72-cc81-42b0-9f5e-47bd58ed4a69 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1849.147753] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-6ab8ff6f-2e50-404d-bbdd-d71eb4af87a3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1849.154759] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Waiting for the task: (returnval){ [ 1849.154759] env[69648]: value = "task-3466652" [ 1849.154759] env[69648]: _type = "Task" [ 1849.154759] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1849.162533] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466652, 'name': RelocateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1849.667121] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466652, 'name': RelocateVM_Task} progress is 42%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1850.169459] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466652, 'name': RelocateVM_Task} progress is 56%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1850.669537] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466652, 'name': RelocateVM_Task} progress is 71%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1851.172363] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466652, 'name': RelocateVM_Task} progress is 86%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1851.670362] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466652, 'name': RelocateVM_Task} progress is 97%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1852.171425] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466652, 'name': RelocateVM_Task, 'duration_secs': 2.929999} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1852.171667] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Volume attach. 
Driver type: vmdk {{(pid=69648) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 1852.171843] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692404', 'volume_id': '704cedaf-839e-4387-95a7-76600b742c77', 'name': 'volume-704cedaf-839e-4387-95a7-76600b742c77', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '31c84a7e-7a41-4d9f-ad29-6dad6648d85f', 'attached_at': '', 'detached_at': '', 'volume_id': '704cedaf-839e-4387-95a7-76600b742c77', 'serial': '704cedaf-839e-4387-95a7-76600b742c77'} {{(pid=69648) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 1852.172609] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3680c415-1990-45ec-ba95-bb95ccd46198 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.189081] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45cf07fa-f553-4421-a480-ad8862fd742e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.211221] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Reconfiguring VM instance instance-00000058 to attach disk [datastore1] volume-704cedaf-839e-4387-95a7-76600b742c77/volume-704cedaf-839e-4387-95a7-76600b742c77.vmdk or device None with type thin {{(pid=69648) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 1852.211476] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-6fb91482-973f-4ed4-9d9a-e1356286f22f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.232135] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Waiting for the task: (returnval){ [ 1852.232135] env[69648]: value = "task-3466653" [ 1852.232135] env[69648]: _type = "Task" [ 1852.232135] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1852.239295] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466653, 'name': ReconfigVM_Task} progress is 5%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1852.741919] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466653, 'name': ReconfigVM_Task, 'duration_secs': 0.339319} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1852.742587] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Reconfigured VM instance instance-00000058 to attach disk [datastore1] volume-704cedaf-839e-4387-95a7-76600b742c77/volume-704cedaf-839e-4387-95a7-76600b742c77.vmdk or device None with type thin {{(pid=69648) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 1852.747379] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-6941d85c-5d33-4a48-816a-8dcd68f6909b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.762032] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Waiting for the task: (returnval){ [ 1852.762032] env[69648]: value = "task-3466654" [ 1852.762032] env[69648]: _type = "Task" [ 1852.762032] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1852.768806] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466654, 'name': ReconfigVM_Task} progress is 5%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1853.271530] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466654, 'name': ReconfigVM_Task, 'duration_secs': 0.122668} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1853.271785] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692404', 'volume_id': '704cedaf-839e-4387-95a7-76600b742c77', 'name': 'volume-704cedaf-839e-4387-95a7-76600b742c77', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '31c84a7e-7a41-4d9f-ad29-6dad6648d85f', 'attached_at': '', 'detached_at': '', 'volume_id': '704cedaf-839e-4387-95a7-76600b742c77', 'serial': '704cedaf-839e-4387-95a7-76600b742c77'} {{(pid=69648) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 1853.272398] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-3540d621-2e0e-4060-8226-8a411c7c3766 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.278380] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Waiting for the task: (returnval){ [ 1853.278380] env[69648]: value = "task-3466655" [ 1853.278380] env[69648]: _type = "Task" [ 1853.278380] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1853.286883] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466655, 'name': Rename_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1853.788514] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466655, 'name': Rename_Task, 'duration_secs': 0.119544} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1853.788840] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Powering on the VM {{(pid=69648) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 1853.789032] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-f056056a-61c6-496e-9fba-6ac817404c32 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.795575] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Waiting for the task: (returnval){ [ 1853.795575] env[69648]: value = "task-3466656" [ 1853.795575] env[69648]: _type = "Task" [ 1853.795575] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1853.802895] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466656, 'name': PowerOnVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1854.306076] env[69648]: DEBUG oslo_vmware.api [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466656, 'name': PowerOnVM_Task, 'duration_secs': 0.437168} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1854.306346] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Powered on the VM {{(pid=69648) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 1854.306554] env[69648]: INFO nova.compute.manager [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Took 6.64 seconds to spawn the instance on the hypervisor. [ 1854.306839] env[69648]: DEBUG nova.compute.manager [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Checking state {{(pid=69648) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} [ 1854.307916] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-347c29bf-0781-4fc9-ad35-534797dceedb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.359463] env[69648]: INFO nova.compute.manager [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Took 7.45 seconds to build instance. [ 1854.374637] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ffe02e31-457f-4423-95a4-4f9aa65afb91 tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 183.916s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1854.383515] env[69648]: DEBUG nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1854.436933] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1854.437230] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1854.438893] env[69648]: INFO nova.compute.claims [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1854.659204] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-852e3b23-9d57-4be9-8cd1-d0fc4a231469 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.666541] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40c46770-d2f5-437d-9cc3-b97d36127af8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.697726] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24d65f39-0d95-4811-967e-883467fdbd66 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.704622] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0da6c434-891b-4e26-88c5-5d25c7fdeddb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.718508] env[69648]: DEBUG nova.compute.provider_tree [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1854.727063] env[69648]: DEBUG nova.scheduler.client.report [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1854.739338] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1854.739788] env[69648]: DEBUG nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1854.771219] env[69648]: DEBUG nova.compute.utils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1854.772501] env[69648]: DEBUG nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1854.772671] env[69648]: DEBUG nova.network.neutron [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1854.782308] env[69648]: DEBUG nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1854.830248] env[69648]: DEBUG nova.policy [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0a1e78d39d744d39b01da61d52a96c36', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '73994a87306e4ce088729c3bb5476f3e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1854.847146] env[69648]: DEBUG nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1854.872157] env[69648]: DEBUG nova.virt.hardware [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1854.872416] env[69648]: DEBUG nova.virt.hardware [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1854.872575] env[69648]: DEBUG nova.virt.hardware [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1854.872757] env[69648]: DEBUG nova.virt.hardware [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1854.872906] env[69648]: DEBUG nova.virt.hardware [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1854.873068] env[69648]: DEBUG nova.virt.hardware [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1854.873282] env[69648]: DEBUG nova.virt.hardware [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1854.873442] env[69648]: DEBUG nova.virt.hardware [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1854.873608] env[69648]: DEBUG nova.virt.hardware [None 
req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1854.873771] env[69648]: DEBUG nova.virt.hardware [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1854.873942] env[69648]: DEBUG nova.virt.hardware [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1854.874817] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e74d7995-620f-45a1-bbd4-0afec0fe55e4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.882491] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abee5ec1-0828-47fd-ad2f-2b7240631c25 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.145036] env[69648]: DEBUG nova.network.neutron [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Successfully created port: 6b8b6945-83c1-422f-b6e3-b844d4d7898d {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1856.200750] env[69648]: DEBUG nova.compute.manager [req-6b84be10-ce3f-422c-a1e3-7029f52a895a req-15fa1fa0-8aef-4dc1-9de0-297ca22c8bb1 service nova] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Received event network-vif-plugged-6b8b6945-83c1-422f-b6e3-b844d4d7898d {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1856.200994] env[69648]: DEBUG oslo_concurrency.lockutils [req-6b84be10-ce3f-422c-a1e3-7029f52a895a req-15fa1fa0-8aef-4dc1-9de0-297ca22c8bb1 service nova] Acquiring lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1856.201221] env[69648]: DEBUG oslo_concurrency.lockutils [req-6b84be10-ce3f-422c-a1e3-7029f52a895a req-15fa1fa0-8aef-4dc1-9de0-297ca22c8bb1 service nova] Lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1856.201392] env[69648]: DEBUG oslo_concurrency.lockutils [req-6b84be10-ce3f-422c-a1e3-7029f52a895a req-15fa1fa0-8aef-4dc1-9de0-297ca22c8bb1 service nova] Lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1856.201564] env[69648]: DEBUG nova.compute.manager 
[req-6b84be10-ce3f-422c-a1e3-7029f52a895a req-15fa1fa0-8aef-4dc1-9de0-297ca22c8bb1 service nova] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] No waiting events found dispatching network-vif-plugged-6b8b6945-83c1-422f-b6e3-b844d4d7898d {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1856.201731] env[69648]: WARNING nova.compute.manager [req-6b84be10-ce3f-422c-a1e3-7029f52a895a req-15fa1fa0-8aef-4dc1-9de0-297ca22c8bb1 service nova] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Received unexpected event network-vif-plugged-6b8b6945-83c1-422f-b6e3-b844d4d7898d for instance with vm_state building and task_state spawning. [ 1856.390473] env[69648]: DEBUG nova.network.neutron [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Successfully updated port: 6b8b6945-83c1-422f-b6e3-b844d4d7898d {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1856.402170] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "refresh_cache-3d53af88-d0ea-4aff-a36b-23eb2c07bd68" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1856.402313] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquired lock "refresh_cache-3d53af88-d0ea-4aff-a36b-23eb2c07bd68" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1856.402468] env[69648]: DEBUG nova.network.neutron [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1856.473887] env[69648]: DEBUG nova.network.neutron [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1856.966824] env[69648]: DEBUG nova.network.neutron [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Updating instance_info_cache with network_info: [{"id": "6b8b6945-83c1-422f-b6e3-b844d4d7898d", "address": "fa:16:3e:d6:18:a3", "network": {"id": "0ff6cf3f-cb54-4cbd-b96e-0612988b03df", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1838163903-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "73994a87306e4ce088729c3bb5476f3e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "113aa98d-90ca-43bc-a534-8908d1ec7d15", "external-id": "nsx-vlan-transportzone-186", "segmentation_id": 186, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b8b6945-83", "ovs_interfaceid": "6b8b6945-83c1-422f-b6e3-b844d4d7898d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1856.992448] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Releasing lock "refresh_cache-3d53af88-d0ea-4aff-a36b-23eb2c07bd68" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1856.993271] env[69648]: DEBUG nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Instance network_info: |[{"id": "6b8b6945-83c1-422f-b6e3-b844d4d7898d", "address": "fa:16:3e:d6:18:a3", "network": {"id": "0ff6cf3f-cb54-4cbd-b96e-0612988b03df", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1838163903-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "73994a87306e4ce088729c3bb5476f3e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "113aa98d-90ca-43bc-a534-8908d1ec7d15", "external-id": "nsx-vlan-transportzone-186", "segmentation_id": 186, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b8b6945-83", "ovs_interfaceid": "6b8b6945-83c1-422f-b6e3-b844d4d7898d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1856.993635] env[69648]: 
DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d6:18:a3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '113aa98d-90ca-43bc-a534-8908d1ec7d15', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6b8b6945-83c1-422f-b6e3-b844d4d7898d', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1857.002104] env[69648]: DEBUG oslo.service.loopingcall [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1857.002648] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1857.002893] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e71e6324-fd37-45e7-975d-ba1ce300fa23 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.024000] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1857.024000] env[69648]: value = "task-3466657" [ 1857.024000] env[69648]: _type = "Task" [ 1857.024000] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1857.032430] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466657, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1857.534669] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466657, 'name': CreateVM_Task, 'duration_secs': 0.308832} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1857.536586] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1857.536586] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1857.536586] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1857.536586] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1857.536861] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e5c61411-bf1e-4598-b2ab-5f4473c1dd7f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.540968] env[69648]: DEBUG oslo_vmware.api [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for the task: (returnval){ [ 1857.540968] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52822ba5-923d-bc74-9fc9-e0550347e939" [ 1857.540968] env[69648]: _type = "Task" [ 1857.540968] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1857.549838] env[69648]: DEBUG oslo_vmware.api [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52822ba5-923d-bc74-9fc9-e0550347e939, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1858.051958] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1858.052393] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1858.052480] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1858.237097] env[69648]: DEBUG nova.compute.manager [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Received event network-changed-6b8b6945-83c1-422f-b6e3-b844d4d7898d {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1858.237312] env[69648]: DEBUG nova.compute.manager [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Refreshing instance network info cache due to event network-changed-6b8b6945-83c1-422f-b6e3-b844d4d7898d. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1858.237530] env[69648]: DEBUG oslo_concurrency.lockutils [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] Acquiring lock "refresh_cache-3d53af88-d0ea-4aff-a36b-23eb2c07bd68" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1858.237679] env[69648]: DEBUG oslo_concurrency.lockutils [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] Acquired lock "refresh_cache-3d53af88-d0ea-4aff-a36b-23eb2c07bd68" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1858.237848] env[69648]: DEBUG nova.network.neutron [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Refreshing network info cache for port 6b8b6945-83c1-422f-b6e3-b844d4d7898d {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1858.499793] env[69648]: DEBUG nova.network.neutron [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Updated VIF entry in instance network info cache for port 6b8b6945-83c1-422f-b6e3-b844d4d7898d. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1858.500156] env[69648]: DEBUG nova.network.neutron [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Updating instance_info_cache with network_info: [{"id": "6b8b6945-83c1-422f-b6e3-b844d4d7898d", "address": "fa:16:3e:d6:18:a3", "network": {"id": "0ff6cf3f-cb54-4cbd-b96e-0612988b03df", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1838163903-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "73994a87306e4ce088729c3bb5476f3e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "113aa98d-90ca-43bc-a534-8908d1ec7d15", "external-id": "nsx-vlan-transportzone-186", "segmentation_id": 186, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b8b6945-83", "ovs_interfaceid": "6b8b6945-83c1-422f-b6e3-b844d4d7898d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1858.509203] env[69648]: DEBUG oslo_concurrency.lockutils [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] Releasing lock "refresh_cache-3d53af88-d0ea-4aff-a36b-23eb2c07bd68" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1858.509388] env[69648]: DEBUG nova.compute.manager [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Received event network-changed-7c126308-8c1c-4227-8b18-c51ab893b6fd {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1858.509557] env[69648]: DEBUG nova.compute.manager [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Refreshing instance network info cache due to event network-changed-7c126308-8c1c-4227-8b18-c51ab893b6fd. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1858.509761] env[69648]: DEBUG oslo_concurrency.lockutils [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] Acquiring lock "refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1858.509901] env[69648]: DEBUG oslo_concurrency.lockutils [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] Acquired lock "refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1858.510072] env[69648]: DEBUG nova.network.neutron [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Refreshing network info cache for port 7c126308-8c1c-4227-8b18-c51ab893b6fd {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1858.764564] env[69648]: DEBUG nova.network.neutron [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Updated VIF entry in instance network info cache for port 7c126308-8c1c-4227-8b18-c51ab893b6fd. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1858.764923] env[69648]: DEBUG nova.network.neutron [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Updating instance_info_cache with network_info: [{"id": "7c126308-8c1c-4227-8b18-c51ab893b6fd", "address": "fa:16:3e:74:e1:49", "network": {"id": "da58d28c-4c27-429d-9fb8-fed88a7172d2", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1196888857-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c54987c4289e424e80abbc7dfcdb5547", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec3f9e71-839a-429d-b211-d3dfc98ca4f6", "external-id": "nsx-vlan-transportzone-5", "segmentation_id": 5, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c126308-8c", "ovs_interfaceid": "7c126308-8c1c-4227-8b18-c51ab893b6fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1858.775788] env[69648]: DEBUG oslo_concurrency.lockutils [req-966acfdb-db86-421a-80e8-35bc39580609 req-3472113b-5418-4321-8614-92be00d65614 service nova] Releasing lock "refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1859.064908] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task 
ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1859.090034] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1859.090168] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1860.065398] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1862.065395] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1863.060613] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1866.067534] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1866.067953] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1866.067953] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1866.079758] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1866.079980] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1866.080168] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1866.080326] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1866.081457] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cfe12f2-f930-49bd-9633-ef0575a28b67 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1866.090376] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d916b22d-d134-4758-a2ae-5bb41ce0256b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1866.103983] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa1152d4-97a1-463a-8a32-4a6349a8352a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1866.110430] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6aa16f28-0ad8-469b-ba6c-8a5050f69c96 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1866.141885] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180922MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1866.142052] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1866.142252] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1866.219720] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1866.219890] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1866.220031] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 114bdafc-21f6-4a77-bf19-a444cbd8806c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1866.220163] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1866.220284] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1866.220403] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 723972b1-3f91-4c59-b265-3975644dadb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1866.220522] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 18745ec2-477d-427d-b2dd-997f73d9fd53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1866.220638] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1866.220753] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1866.220868] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 31c84a7e-7a41-4d9f-ad29-6dad6648d85f actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1866.220982] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1866.235267] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1866.245374] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cb6b7f04-1c44-4998-bd28-8a01c4b235e8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1866.255811] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cc77a95f-ea00-4b01-96ac-8256672eeb39 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1866.256044] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 11 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1866.256199] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1920MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=11 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1866.405227] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5360c58-5f29-4838-9b81-47be47ad6399 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1866.412764] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b7c1a1f-1ebe-40c1-b70f-cece15b24910 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1866.441811] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54edb504-b00f-4781-8f75-6fa29c76da7c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1866.448880] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5a02add-7f49-4be1-85f9-659f0a1962fb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1866.462291] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1866.470742] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1866.484165] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1866.484346] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.342s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1868.482649] env[69648]: DEBUG oslo_service.periodic_task [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1871.067288] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1871.067288] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1871.067288] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1871.088429] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1871.088594] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1871.088717] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1871.088846] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1871.088995] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1871.089139] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1871.089264] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1871.089382] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1871.089499] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1871.089623] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1871.142924] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1871.143099] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquired lock "refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1871.143312] env[69648]: DEBUG nova.network.neutron [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Forcefully refreshing network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2003}} [ 1871.143437] env[69648]: DEBUG nova.objects.instance [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lazy-loading 'info_cache' on Instance uuid 31c84a7e-7a41-4d9f-ad29-6dad6648d85f {{(pid=69648) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} [ 1871.435542] env[69648]: DEBUG nova.network.neutron [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Updating instance_info_cache with network_info: [{"id": "7c126308-8c1c-4227-8b18-c51ab893b6fd", "address": "fa:16:3e:74:e1:49", "network": {"id": "da58d28c-4c27-429d-9fb8-fed88a7172d2", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1196888857-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c54987c4289e424e80abbc7dfcdb5547", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec3f9e71-839a-429d-b211-d3dfc98ca4f6", "external-id": "nsx-vlan-transportzone-5", "segmentation_id": 5, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c126308-8c", "ovs_interfaceid": "7c126308-8c1c-4227-8b18-c51ab893b6fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1871.445144] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Releasing lock 
"refresh_cache-31c84a7e-7a41-4d9f-ad29-6dad6648d85f" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1871.445338] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Updated the network info_cache for instance {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9988}} [ 1872.826158] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Acquiring lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1872.826474] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1872.826710] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Acquiring lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1872.826786] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1872.826939] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1872.829776] env[69648]: INFO nova.compute.manager [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Terminating instance [ 1872.832155] env[69648]: DEBUG nova.compute.manager [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1872.832374] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Powering off the VM {{(pid=69648) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 1872.832826] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-229291fc-9410-4e6a-8e31-accbbb83e38a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1872.840531] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Waiting for the task: (returnval){ [ 1872.840531] env[69648]: value = "task-3466658" [ 1872.840531] env[69648]: _type = "Task" [ 1872.840531] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1872.848496] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466658, 'name': PowerOffVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1873.350564] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466658, 'name': PowerOffVM_Task, 'duration_secs': 0.204508} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1873.350819] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Powered off the VM {{(pid=69648) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 1873.351028] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Volume detach. 
Driver type: vmdk {{(pid=69648) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1873.351230] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692404', 'volume_id': '704cedaf-839e-4387-95a7-76600b742c77', 'name': 'volume-704cedaf-839e-4387-95a7-76600b742c77', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '31c84a7e-7a41-4d9f-ad29-6dad6648d85f', 'attached_at': '', 'detached_at': '', 'volume_id': '704cedaf-839e-4387-95a7-76600b742c77', 'serial': '704cedaf-839e-4387-95a7-76600b742c77'} {{(pid=69648) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 1873.351967] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06fdb7ad-c47f-47dd-87ab-7e13ee8f6ecc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1873.369505] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc7836ca-b330-40e7-b205-fb1b2f760394 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1873.375417] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae80d47f-7072-4b3c-be4d-8a0ac21845e3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1873.393017] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b05a0743-aec9-4f08-be5a-d54b66d868e8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1873.407051] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] The volume has not been displaced from its original location: [datastore1] volume-704cedaf-839e-4387-95a7-76600b742c77/volume-704cedaf-839e-4387-95a7-76600b742c77.vmdk. No consolidation needed. 
{{(pid=69648) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 1873.412274] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Reconfiguring VM instance instance-00000058 to detach disk 2000 {{(pid=69648) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 1873.412517] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-6d82b7c5-326e-45f9-aa7a-ea3f10424619 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1873.429500] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Waiting for the task: (returnval){ [ 1873.429500] env[69648]: value = "task-3466659" [ 1873.429500] env[69648]: _type = "Task" [ 1873.429500] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1873.438924] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466659, 'name': ReconfigVM_Task} progress is 6%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1873.940236] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466659, 'name': ReconfigVM_Task, 'duration_secs': 0.145087} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1873.940548] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Reconfigured VM instance instance-00000058 to detach disk 2000 {{(pid=69648) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 1873.945177] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-e1c742e9-2dc9-4617-bb0d-a6fe3eea7db0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1873.959665] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Waiting for the task: (returnval){ [ 1873.959665] env[69648]: value = "task-3466660" [ 1873.959665] env[69648]: _type = "Task" [ 1873.959665] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1873.967286] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466660, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1874.469797] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466660, 'name': ReconfigVM_Task, 'duration_secs': 0.127547} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1874.470048] env[69648]: DEBUG nova.virt.vmwareapi.volumeops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692404', 'volume_id': '704cedaf-839e-4387-95a7-76600b742c77', 'name': 'volume-704cedaf-839e-4387-95a7-76600b742c77', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '31c84a7e-7a41-4d9f-ad29-6dad6648d85f', 'attached_at': '', 'detached_at': '', 'volume_id': '704cedaf-839e-4387-95a7-76600b742c77', 'serial': '704cedaf-839e-4387-95a7-76600b742c77'} {{(pid=69648) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 1874.470343] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1874.471118] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db469ab2-eafd-4485-83e7-ccadbcf1af2b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.477327] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1874.477545] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dd5b17b8-941a-4239-8340-e6257e36767f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.541365] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1874.541602] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1874.541789] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 
tempest-ServersTestBootFromVolume-972290934-project-member] Deleting the datastore file [datastore1] 31c84a7e-7a41-4d9f-ad29-6dad6648d85f {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1874.542081] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fa0ed234-05c2-4a2e-8d6f-cfbfb1c76057 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.547948] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Waiting for the task: (returnval){ [ 1874.547948] env[69648]: value = "task-3466662" [ 1874.547948] env[69648]: _type = "Task" [ 1874.547948] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1874.555515] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466662, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1875.058053] env[69648]: DEBUG oslo_vmware.api [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Task: {'id': task-3466662, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068516} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1875.058366] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1875.058476] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1875.058653] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1875.058825] env[69648]: INFO nova.compute.manager [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Took 2.23 seconds to destroy the instance on the hypervisor. [ 1875.059080] env[69648]: DEBUG oslo.service.loopingcall [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1875.059269] env[69648]: DEBUG nova.compute.manager [-] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1875.059365] env[69648]: DEBUG nova.network.neutron [-] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1875.793133] env[69648]: DEBUG nova.network.neutron [-] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1875.811287] env[69648]: DEBUG nova.compute.manager [req-9eace059-18c1-43ec-b2b7-e64725b95956 req-fa5cf665-ab7b-43a2-8b9a-90f636c2872f service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Received event network-vif-deleted-7c126308-8c1c-4227-8b18-c51ab893b6fd {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1875.811477] env[69648]: INFO nova.compute.manager [req-9eace059-18c1-43ec-b2b7-e64725b95956 req-fa5cf665-ab7b-43a2-8b9a-90f636c2872f service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Neutron deleted interface 7c126308-8c1c-4227-8b18-c51ab893b6fd; detaching it from the instance and deleting it from the info cache [ 1875.811650] env[69648]: DEBUG nova.network.neutron [req-9eace059-18c1-43ec-b2b7-e64725b95956 req-fa5cf665-ab7b-43a2-8b9a-90f636c2872f service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1875.813869] env[69648]: INFO nova.compute.manager [-] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Took 0.75 seconds to deallocate network for instance. [ 1875.822299] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-26bb5f7a-9803-4111-bb60-6808659daebb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.832221] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5faf3ab0-1356-4191-867e-e3ce6884da1c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.864079] env[69648]: DEBUG nova.compute.manager [req-9eace059-18c1-43ec-b2b7-e64725b95956 req-fa5cf665-ab7b-43a2-8b9a-90f636c2872f service nova] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Detach interface failed, port_id=7c126308-8c1c-4227-8b18-c51ab893b6fd, reason: Instance 31c84a7e-7a41-4d9f-ad29-6dad6648d85f could not be found. {{(pid=69648) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10941}} [ 1875.892334] env[69648]: INFO nova.compute.manager [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Took 0.08 seconds to detach 1 volumes for instance. 
[ 1875.895925] env[69648]: DEBUG nova.compute.manager [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Deleting volume: 704cedaf-839e-4387-95a7-76600b742c77 {{(pid=69648) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3222}} [ 1875.995302] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1875.995573] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1875.995792] env[69648]: DEBUG nova.objects.instance [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Lazy-loading 'resources' on Instance uuid 31c84a7e-7a41-4d9f-ad29-6dad6648d85f {{(pid=69648) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} [ 1876.232558] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02d0fa87-8a36-4cfd-ab0e-8435427f46db {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.240363] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f91fbe1-eef9-4a65-90bb-d25c8db89a94 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.273579] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5316ec85-3a66-476c-b528-886b637e1bf3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.281221] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-485fa3a0-0e17-4d02-ac29-0134664a0bb2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.295547] env[69648]: DEBUG nova.compute.provider_tree [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1876.308892] env[69648]: DEBUG nova.scheduler.client.report [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1876.330043] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.334s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1876.355866] env[69648]: INFO nova.scheduler.client.report [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Deleted allocations for instance 31c84a7e-7a41-4d9f-ad29-6dad6648d85f [ 1876.410615] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c22d5186-a6c2-424f-a71c-5bf8869592ce tempest-ServersTestBootFromVolume-972290934 tempest-ServersTestBootFromVolume-972290934-project-member] Lock "31c84a7e-7a41-4d9f-ad29-6dad6648d85f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.584s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1894.766029] env[69648]: WARNING oslo_vmware.rw_handles [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1894.766029] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1894.766029] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1894.766029] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1894.766029] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1894.766029] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1894.766029] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1894.766029] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1894.766029] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1894.766029] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1894.766029] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1894.766029] env[69648]: ERROR oslo_vmware.rw_handles [ 1894.766750] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/84324e3a-2109-418e-8741-c4d48c69cc10/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1894.768578]
env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1894.768826] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Copying Virtual Disk [datastore1] vmware_temp/84324e3a-2109-418e-8741-c4d48c69cc10/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/84324e3a-2109-418e-8741-c4d48c69cc10/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1894.769170] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-38d1f1ed-f17b-462e-8aff-978231329254 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1894.777095] env[69648]: DEBUG oslo_vmware.api [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for the task: (returnval){ [ 1894.777095] env[69648]: value = "task-3466664" [ 1894.777095] env[69648]: _type = "Task" [ 1894.777095] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1894.784601] env[69648]: DEBUG oslo_vmware.api [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Task: {'id': task-3466664, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1895.289029] env[69648]: DEBUG oslo_vmware.exceptions [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1895.289029] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1895.289029] env[69648]: ERROR nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1895.289029] env[69648]: Faults: ['InvalidArgument'] [ 1895.289029] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Traceback (most recent call last): [ 1895.289029] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1895.289029] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] yield resources [ 1895.289029] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1895.289029] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] self.driver.spawn(context, instance, image_meta, [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] self._fetch_image_if_missing(context, vi) [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] image_cache(vi, tmp_image_ds_loc) [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] vm_util.copy_virtual_disk( [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] session._wait_for_task(vmdk_copy_task) [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] return self.wait_for_task(task_ref) [ 1895.289770] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] return evt.wait() [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] result = hub.switch() [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] return self.greenlet.switch() [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] self.f(*self.args, **self.kw) [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] raise exceptions.translate_fault(task_info.error) [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Faults: ['InvalidArgument'] [ 1895.290315] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] [ 1895.290666] env[69648]: INFO nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Terminating instance [ 1895.290889] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1895.291113] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1895.291358] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fd596f4e-0943-4ef5-90fa-e1ad31c65c55 
{{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.293501] env[69648]: DEBUG nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1895.293695] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1895.294431] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d94e3b9-ce1f-4cfb-8548-b70fb98af057 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.301291] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1895.301496] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-28b4815b-42ff-4e25-bfee-05607512eb2e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.303571] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1895.303749] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1895.304661] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d3fbde4f-abae-447a-aff9-859f96fa8288 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.309097] env[69648]: DEBUG oslo_vmware.api [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for the task: (returnval){ [ 1895.309097] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52286d44-bb1f-1f80-df30-65321850746a" [ 1895.309097] env[69648]: _type = "Task" [ 1895.309097] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1895.315982] env[69648]: DEBUG oslo_vmware.api [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52286d44-bb1f-1f80-df30-65321850746a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1895.371675] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1895.371924] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1895.372076] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Deleting the datastore file [datastore1] 590dbeb2-7e21-454f-93b5-97065c5bfdb0 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1895.372345] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-35cf7ede-c0bb-4f23-9c91-605cf775814c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.378359] env[69648]: DEBUG oslo_vmware.api [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for the task: (returnval){ [ 1895.378359] env[69648]: value = "task-3466666" [ 1895.378359] env[69648]: _type = "Task" [ 1895.378359] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1895.385700] env[69648]: DEBUG oslo_vmware.api [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Task: {'id': task-3466666, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1895.819794] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1895.820240] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Creating directory with path [datastore1] vmware_temp/ba85c98a-b822-4d14-a419-5363c8fc5c1a/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1895.820240] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d9f24cfd-27a8-467f-a83d-ed19d20d823c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.832075] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Created directory with path [datastore1] vmware_temp/ba85c98a-b822-4d14-a419-5363c8fc5c1a/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1895.832299] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Fetch image to [datastore1] vmware_temp/ba85c98a-b822-4d14-a419-5363c8fc5c1a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1895.832439] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/ba85c98a-b822-4d14-a419-5363c8fc5c1a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1895.833169] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db3957b4-c527-44ff-8a0c-3077ed0cc60e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.839751] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c125d67f-d421-46e7-aaf5-f488e2d18178 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.848620] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bcf44dd-4d4f-41c9-92f1-03d4e0ab851e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.882270] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9effa3fc-c31e-4274-836d-229efab88378 {{(pid=69648) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.890497] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ec2dc8bb-53a0-40ee-9500-b621730aa687 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.892118] env[69648]: DEBUG oslo_vmware.api [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Task: {'id': task-3466666, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07628} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1895.892354] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1895.892532] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1895.892701] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1895.892872] env[69648]: INFO nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Took 0.60 seconds to destroy the instance on the hypervisor. 
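The destroy sequence above (UnregisterVM, FileManager.DeleteDatastoreFile_Task, then the repeated "Waiting for the task ... to complete" / "progress is 0%" polls) follows oslo.vmware's submit-and-poll pattern. A minimal, self-contained sketch of such a polling loop is below; TaskInfo, get_task_info and TaskFault are illustrative placeholders for this sketch, not the real oslo.vmware API.

import time
from dataclasses import dataclass

# Illustrative placeholders only; the real task states and info objects come
# from the vSphere API via oslo.vmware.
RUNNING, SUCCESS, ERROR = "running", "success", "error"

@dataclass
class TaskInfo:
    state: str
    result: object = None
    error: str = ""

class TaskFault(Exception):
    """Stands in for oslo_vmware.exceptions.VimFaultException in this sketch."""

def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    """Poll a vCenter-style task until it finishes, mirroring the
    'Waiting for the task ... to complete' / '_poll_task' log lines."""
    while True:
        info = get_task_info(task_ref)
        if info.state == SUCCESS:
            return info.result
        if info.state == ERROR:
            # In the real driver this is where the fault is translated and
            # raised, e.g. the "fileType" InvalidArgument fault later in this log.
            raise TaskFault(info.error)
        time.sleep(poll_interval)

if __name__ == "__main__":
    # Toy task that reports running twice, then succeeds.
    states = iter([TaskInfo(RUNNING), TaskInfo(RUNNING), TaskInfo(SUCCESS, "done")])
    print(wait_for_task(lambda ref: next(states), "task-3466666", poll_interval=0.01))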
[ 1895.894862] env[69648]: DEBUG nova.compute.claims [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1895.895038] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1895.895296] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1895.914050] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1895.967353] env[69648]: DEBUG oslo_vmware.rw_handles [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ba85c98a-b822-4d14-a419-5363c8fc5c1a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1896.026678] env[69648]: DEBUG oslo_vmware.rw_handles [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1896.026887] env[69648]: DEBUG oslo_vmware.rw_handles [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ba85c98a-b822-4d14-a419-5363c8fc5c1a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1896.139338] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e84a487-b126-4c95-81dc-a308bb4192f6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.147101] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f403c39e-4c84-4931-816b-4a0361a0cf73 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.176317] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcda55f7-3f32-45a2-acdf-ff057021f253 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.184686] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c61b7970-41d2-4244-92cb-aa3bf36bb5b6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.197769] env[69648]: DEBUG nova.compute.provider_tree [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1896.206124] env[69648]: DEBUG nova.scheduler.client.report [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1896.219726] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.324s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1896.220260] env[69648]: ERROR nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1896.220260] env[69648]: Faults: ['InvalidArgument'] [ 1896.220260] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Traceback (most recent call last): [ 1896.220260] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance 
[ 1896.220260] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] self.driver.spawn(context, instance, image_meta, [ 1896.220260] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1896.220260] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1896.220260] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1896.220260] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] self._fetch_image_if_missing(context, vi) [ 1896.220260] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1896.220260] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] image_cache(vi, tmp_image_ds_loc) [ 1896.220260] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] vm_util.copy_virtual_disk( [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] session._wait_for_task(vmdk_copy_task) [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] return self.wait_for_task(task_ref) [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] return evt.wait() [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] result = hub.switch() [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] return self.greenlet.switch() [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1896.220666] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] self.f(*self.args, **self.kw) [ 1896.221106] env[69648]: ERROR nova.compute.manager [instance: 
590dbeb2-7e21-454f-93b5-97065c5bfdb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1896.221106] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] raise exceptions.translate_fault(task_info.error) [ 1896.221106] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1896.221106] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Faults: ['InvalidArgument'] [ 1896.221106] env[69648]: ERROR nova.compute.manager [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] [ 1896.221106] env[69648]: DEBUG nova.compute.utils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1896.222384] env[69648]: DEBUG nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Build of instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 was re-scheduled: A specified parameter was not correct: fileType [ 1896.222384] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1896.222794] env[69648]: DEBUG nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1896.222927] env[69648]: DEBUG nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1896.223092] env[69648]: DEBUG nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1896.223256] env[69648]: DEBUG nova.network.neutron [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1896.718285] env[69648]: DEBUG nova.network.neutron [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1896.730836] env[69648]: INFO nova.compute.manager [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Took 0.51 seconds to deallocate network for instance. [ 1896.817816] env[69648]: INFO nova.scheduler.client.report [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Deleted allocations for instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 [ 1896.837860] env[69648]: DEBUG oslo_concurrency.lockutils [None req-69c386f4-e25c-41f7-b07e-54faaa1fe164 tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 575.030s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1896.839365] env[69648]: DEBUG oslo_concurrency.lockutils [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 379.616s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1896.839365] env[69648]: DEBUG oslo_concurrency.lockutils [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Acquiring lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1896.839541] env[69648]: DEBUG oslo_concurrency.lockutils [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1896.839731] env[69648]: DEBUG oslo_concurrency.lockutils [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1896.842252] env[69648]: INFO nova.compute.manager [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Terminating instance [ 1896.844233] env[69648]: DEBUG nova.compute.manager [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1896.844500] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1896.845327] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-59a7aa74-9bd7-41db-a9ae-dd5c06366742 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.851325] env[69648]: DEBUG nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1896.858083] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49a1d549-607e-4df0-be51-d88804459b0c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.887363] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 590dbeb2-7e21-454f-93b5-97065c5bfdb0 could not be found. 
[ 1896.887537] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1896.887718] env[69648]: INFO nova.compute.manager [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1896.887963] env[69648]: DEBUG oslo.service.loopingcall [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1896.892303] env[69648]: DEBUG nova.compute.manager [-] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1896.892428] env[69648]: DEBUG nova.network.neutron [-] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1896.903939] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1896.904190] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1896.905672] env[69648]: INFO nova.compute.claims [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1896.919992] env[69648]: DEBUG nova.network.neutron [-] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1896.941920] env[69648]: INFO nova.compute.manager [-] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] Took 0.05 seconds to deallocate network for instance. 
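The resource claim logged just above succeeds against the inventory the report client repeatedly logs for provider d38a352b-7808-44da-8216-792e96aadc88. Placement treats usable capacity as (total - reserved) * allocation_ratio, so the numbers from this log work out as in the short sketch below (plain Python, no Nova imports):

# Inventory values copied from the report-client lines in this log.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {capacity:g} allocatable")
# -> VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400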
[ 1897.029686] env[69648]: DEBUG oslo_concurrency.lockutils [None req-68152748-8006-458d-a0e8-ebc327a1d07f tempest-AttachVolumeShelveTestJSON-1883720761 tempest-AttachVolumeShelveTestJSON-1883720761-project-member] Lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.191s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1897.031040] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 102.016s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1897.031326] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 590dbeb2-7e21-454f-93b5-97065c5bfdb0] During sync_power_state the instance has a pending task (deleting). Skip. [ 1897.031562] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "590dbeb2-7e21-454f-93b5-97065c5bfdb0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1897.108338] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02afe56b-8d58-4543-8c30-f271367b97ac {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.116110] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-236c1ad0-fa6d-4d03-9b41-0352997203cf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.147308] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d5aee01-f4d1-4afe-b7c1-e29ae88a33e3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.154062] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06ddfd0b-6100-40df-b27e-fbc34b4e5f87 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.166646] env[69648]: DEBUG nova.compute.provider_tree [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1897.175773] env[69648]: DEBUG nova.scheduler.client.report [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 
400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1897.188720] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1897.189250] env[69648]: DEBUG nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1897.229814] env[69648]: DEBUG nova.compute.utils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1897.231376] env[69648]: DEBUG nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1897.231582] env[69648]: DEBUG nova.network.neutron [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1897.240868] env[69648]: DEBUG nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1897.302631] env[69648]: DEBUG nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1897.305891] env[69648]: DEBUG nova.policy [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'caf89555b5df4f5fa4cac41f6b1792db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca41677808a749f1b88e43a112db7fb2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1897.326333] env[69648]: DEBUG nova.virt.hardware [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1897.326575] env[69648]: DEBUG nova.virt.hardware [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1897.326729] env[69648]: DEBUG nova.virt.hardware [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1897.326908] env[69648]: DEBUG nova.virt.hardware [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1897.327091] env[69648]: DEBUG nova.virt.hardware [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1897.327296] env[69648]: DEBUG nova.virt.hardware [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1897.327475] 
env[69648]: DEBUG nova.virt.hardware [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1897.327636] env[69648]: DEBUG nova.virt.hardware [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1897.327804] env[69648]: DEBUG nova.virt.hardware [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1897.327967] env[69648]: DEBUG nova.virt.hardware [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1897.328154] env[69648]: DEBUG nova.virt.hardware [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1897.329017] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a45b293-05b2-4bc4-921d-ac09ee0f0244 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.336654] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15b07359-d341-4b24-9ece-d4d23a5d06e2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.625383] env[69648]: DEBUG nova.network.neutron [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Successfully created port: 1f0fd598-e3c9-4569-bcea-910dbdadd89e {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1898.470300] env[69648]: DEBUG nova.network.neutron [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Successfully updated port: 1f0fd598-e3c9-4569-bcea-910dbdadd89e {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1898.483149] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "refresh_cache-f13b5f54-2f87-4c7a-9751-4dc5b7762b83" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1898.483493] env[69648]: DEBUG oslo_concurrency.lockutils 
[None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquired lock "refresh_cache-f13b5f54-2f87-4c7a-9751-4dc5b7762b83" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1898.483772] env[69648]: DEBUG nova.network.neutron [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1898.523394] env[69648]: DEBUG nova.network.neutron [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1898.701013] env[69648]: DEBUG nova.network.neutron [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Updating instance_info_cache with network_info: [{"id": "1f0fd598-e3c9-4569-bcea-910dbdadd89e", "address": "fa:16:3e:71:0c:3a", "network": {"id": "c571b585-f722-4160-a8c7-ee2a6eb60153", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-12468984-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ca41677808a749f1b88e43a112db7fb2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f0fd598-e3", "ovs_interfaceid": "1f0fd598-e3c9-4569-bcea-910dbdadd89e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1898.712428] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Releasing lock "refresh_cache-f13b5f54-2f87-4c7a-9751-4dc5b7762b83" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1898.712735] env[69648]: DEBUG nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Instance network_info: |[{"id": "1f0fd598-e3c9-4569-bcea-910dbdadd89e", "address": "fa:16:3e:71:0c:3a", "network": {"id": "c571b585-f722-4160-a8c7-ee2a6eb60153", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-12468984-network", "subnets": [{"cidr": 
"192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ca41677808a749f1b88e43a112db7fb2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f0fd598-e3", "ovs_interfaceid": "1f0fd598-e3c9-4569-bcea-910dbdadd89e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1898.713146] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:71:0c:3a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1a55f45a-d631-4ebc-b73b-8a30bd0a32a8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1f0fd598-e3c9-4569-bcea-910dbdadd89e', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1898.720664] env[69648]: DEBUG oslo.service.loopingcall [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1898.721153] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1898.721383] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8b44704c-de46-49bc-ad41-9e2be82a6287 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1898.741800] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1898.741800] env[69648]: value = "task-3466667" [ 1898.741800] env[69648]: _type = "Task" [ 1898.741800] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1898.749454] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466667, 'name': CreateVM_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1898.766429] env[69648]: DEBUG nova.compute.manager [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Received event network-vif-plugged-1f0fd598-e3c9-4569-bcea-910dbdadd89e {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1898.766662] env[69648]: DEBUG oslo_concurrency.lockutils [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] Acquiring lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1898.766803] env[69648]: DEBUG oslo_concurrency.lockutils [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] Lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1898.766975] env[69648]: DEBUG oslo_concurrency.lockutils [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] Lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1898.767159] env[69648]: DEBUG nova.compute.manager [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] No waiting events found dispatching network-vif-plugged-1f0fd598-e3c9-4569-bcea-910dbdadd89e {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1898.767326] env[69648]: WARNING nova.compute.manager [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Received unexpected event network-vif-plugged-1f0fd598-e3c9-4569-bcea-910dbdadd89e for instance with vm_state building and task_state spawning. [ 1898.767486] env[69648]: DEBUG nova.compute.manager [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Received event network-changed-1f0fd598-e3c9-4569-bcea-910dbdadd89e {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1898.767639] env[69648]: DEBUG nova.compute.manager [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Refreshing instance network info cache due to event network-changed-1f0fd598-e3c9-4569-bcea-910dbdadd89e. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1898.767817] env[69648]: DEBUG oslo_concurrency.lockutils [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] Acquiring lock "refresh_cache-f13b5f54-2f87-4c7a-9751-4dc5b7762b83" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1898.767952] env[69648]: DEBUG oslo_concurrency.lockutils [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] Acquired lock "refresh_cache-f13b5f54-2f87-4c7a-9751-4dc5b7762b83" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1898.768276] env[69648]: DEBUG nova.network.neutron [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Refreshing network info cache for port 1f0fd598-e3c9-4569-bcea-910dbdadd89e {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1899.026234] env[69648]: DEBUG nova.network.neutron [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Updated VIF entry in instance network info cache for port 1f0fd598-e3c9-4569-bcea-910dbdadd89e. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1899.026611] env[69648]: DEBUG nova.network.neutron [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Updating instance_info_cache with network_info: [{"id": "1f0fd598-e3c9-4569-bcea-910dbdadd89e", "address": "fa:16:3e:71:0c:3a", "network": {"id": "c571b585-f722-4160-a8c7-ee2a6eb60153", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-12468984-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ca41677808a749f1b88e43a112db7fb2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f0fd598-e3", "ovs_interfaceid": "1f0fd598-e3c9-4569-bcea-910dbdadd89e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1899.035933] env[69648]: DEBUG oslo_concurrency.lockutils [req-1b3c7103-f1a6-41d0-ab59-d311d31a007b req-7d223e15-874e-4a2c-81a9-f2f440ffc4cc service nova] Releasing lock "refresh_cache-f13b5f54-2f87-4c7a-9751-4dc5b7762b83" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1899.252188] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466667, 'name': CreateVM_Task, 'duration_secs': 0.28732} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1899.252361] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1899.253028] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1899.253298] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1899.253624] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1899.253868] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ffab8d02-45fa-405a-abf7-ab28a5ffe85d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1899.258109] env[69648]: DEBUG oslo_vmware.api [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for the task: (returnval){ [ 1899.258109] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52610bff-52ac-8deb-53c3-f624c9282fd5" [ 1899.258109] env[69648]: _type = "Task" [ 1899.258109] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1899.265528] env[69648]: DEBUG oslo_vmware.api [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52610bff-52ac-8deb-53c3-f624c9282fd5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1899.768445] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1899.768715] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1899.768938] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1920.066214] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1920.066214] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1922.065201] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1922.065474] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1923.060627] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1927.065486] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1927.065908] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1928.066164] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1928.079636] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1928.080009] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1928.080306] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1928.080580] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1928.082266] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5d17812-28b6-4cab-a579-2d88a45d748d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.095240] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a99fd56-b60e-447b-ad71-eefc612f0fff {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.117946] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f57d144-4a70-47f3-a90d-e52dc1e9720d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.128153] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec0d9365-1a2c-4e59-9d43-66274a144d97 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.178650] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180956MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1928.178908] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1928.179246] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1928.259476] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1928.259748] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 114bdafc-21f6-4a77-bf19-a444cbd8806c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1928.259963] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1928.260208] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1928.260430] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 723972b1-3f91-4c59-b265-3975644dadb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1928.260645] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 18745ec2-477d-427d-b2dd-997f73d9fd53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1928.260855] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1928.261077] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1928.261290] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1928.261501] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1928.274930] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cb6b7f04-1c44-4998-bd28-8a01c4b235e8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1928.286875] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cc77a95f-ea00-4b01-96ac-8256672eeb39 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1928.287211] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1928.287432] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1928.454347] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfcb44df-b8bc-4985-8aaf-019738b67470 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.463351] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-893c6c8d-37b5-4b86-b8eb-e5db2ee0b272 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.496611] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86f8bf18-faad-44b2-8984-7dd65ac2c947 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.504416] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcde8059-fca7-4e59-afc9-f8a161b30127 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.518394] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1928.530268] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1928.545166] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1928.545501] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.366s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1930.545998] env[69648]: DEBUG oslo_service.periodic_task [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1932.065664] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1932.065962] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1932.066020] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1932.085334] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1932.085493] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1932.085651] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1932.085785] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1932.085912] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1932.086048] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1932.086175] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1932.086299] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1932.086421] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1932.086542] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1932.086671] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1944.783679] env[69648]: WARNING oslo_vmware.rw_handles [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1944.783679] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1944.783679] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1944.783679] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1944.783679] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1944.783679] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1944.783679] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1944.783679] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1944.783679] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1944.783679] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1944.783679] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1944.783679] env[69648]: ERROR oslo_vmware.rw_handles [ 1944.784575] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/ba85c98a-b822-4d14-a419-5363c8fc5c1a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1944.786221] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1944.786498] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 
tempest-AttachVolumeTestJSON-1843339312-project-member] Copying Virtual Disk [datastore1] vmware_temp/ba85c98a-b822-4d14-a419-5363c8fc5c1a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/ba85c98a-b822-4d14-a419-5363c8fc5c1a/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1944.786800] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-99bc7197-20c0-4570-aca2-126f90ace659 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.794761] env[69648]: DEBUG oslo_vmware.api [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for the task: (returnval){ [ 1944.794761] env[69648]: value = "task-3466668" [ 1944.794761] env[69648]: _type = "Task" [ 1944.794761] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1944.802624] env[69648]: DEBUG oslo_vmware.api [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Task: {'id': task-3466668, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1945.305273] env[69648]: DEBUG oslo_vmware.exceptions [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1945.305572] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1945.306150] env[69648]: ERROR nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1945.306150] env[69648]: Faults: ['InvalidArgument'] [ 1945.306150] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Traceback (most recent call last): [ 1945.306150] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1945.306150] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] yield resources [ 1945.306150] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1945.306150] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] self.driver.spawn(context, instance, image_meta, [ 1945.306150] env[69648]: ERROR 
nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1945.306150] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1945.306150] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1945.306150] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] self._fetch_image_if_missing(context, vi) [ 1945.306150] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] image_cache(vi, tmp_image_ds_loc) [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] vm_util.copy_virtual_disk( [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] session._wait_for_task(vmdk_copy_task) [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] return self.wait_for_task(task_ref) [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] return evt.wait() [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] result = hub.switch() [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1945.306574] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] return self.greenlet.switch() [ 1945.306982] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1945.306982] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] self.f(*self.args, **self.kw) [ 1945.306982] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1945.306982] env[69648]: ERROR nova.compute.manager [instance: 
3dc3db1c-43c0-45e9-8283-38e77f66f06f] raise exceptions.translate_fault(task_info.error) [ 1945.306982] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1945.306982] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Faults: ['InvalidArgument'] [ 1945.306982] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] [ 1945.306982] env[69648]: INFO nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Terminating instance [ 1945.308101] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1945.308328] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1945.308567] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8ddba182-0249-4071-8163-8a4ac520f711 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.310838] env[69648]: DEBUG nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1945.311044] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1945.311793] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab6ea1a0-686d-4f30-a2cd-1df94e9cbada {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.318386] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1945.318655] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8af79afa-a30f-44f4-94e0-88438119d142 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.320811] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1945.320988] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1945.321920] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e45d3f5a-9b85-4424-bc50-94b0b1e8f912 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.326311] env[69648]: DEBUG oslo_vmware.api [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for the task: (returnval){ [ 1945.326311] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5228361e-43f2-4129-3b6e-19166c34a1a5" [ 1945.326311] env[69648]: _type = "Task" [ 1945.326311] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1945.333641] env[69648]: DEBUG oslo_vmware.api [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5228361e-43f2-4129-3b6e-19166c34a1a5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1945.390576] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1945.390576] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1945.390576] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Deleting the datastore file [datastore1] 3dc3db1c-43c0-45e9-8283-38e77f66f06f {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1945.390862] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-31e436cb-eba9-4e7a-a191-2b561e1fa93c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.397075] env[69648]: DEBUG oslo_vmware.api [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for the task: (returnval){ [ 1945.397075] env[69648]: value = "task-3466670" [ 1945.397075] env[69648]: _type = "Task" [ 1945.397075] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1945.404650] env[69648]: DEBUG oslo_vmware.api [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Task: {'id': task-3466670, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1945.836675] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1945.836955] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Creating directory with path [datastore1] vmware_temp/dc21427c-ed90-44cc-9ee2-7b8a85dc4665/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1945.837195] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-411d66c5-338b-4cdb-91ca-225499f27f74 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.849090] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Created directory with path [datastore1] vmware_temp/dc21427c-ed90-44cc-9ee2-7b8a85dc4665/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1945.849287] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Fetch image to [datastore1] vmware_temp/dc21427c-ed90-44cc-9ee2-7b8a85dc4665/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1945.849487] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/dc21427c-ed90-44cc-9ee2-7b8a85dc4665/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1945.850195] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bda6deb3-2df3-4e57-b3c8-21a997f60236 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.856527] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4582cf03-40f8-4d24-9c95-f601cf3b3911 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.866738] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac553b98-d7b4-42c6-8d89-351c0005eaa5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.897236] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ed31b7d5-4414-48df-991e-4727a8e73080 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.907947] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-637639c8-fa01-49a2-9b3e-c6f6cdb1bae1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.909586] env[69648]: DEBUG oslo_vmware.api [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Task: {'id': task-3466670, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073404} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1945.909819] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1945.909998] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1945.910188] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1945.910364] env[69648]: INFO nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1945.912485] env[69648]: DEBUG nova.compute.claims [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1945.912653] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1945.912862] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1945.933751] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1945.984554] env[69648]: DEBUG oslo_vmware.rw_handles [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dc21427c-ed90-44cc-9ee2-7b8a85dc4665/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1946.044350] env[69648]: DEBUG oslo_vmware.rw_handles [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1946.044575] env[69648]: DEBUG oslo_vmware.rw_handles [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dc21427c-ed90-44cc-9ee2-7b8a85dc4665/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1946.149501] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97b49c76-0bc7-4308-8e04-79c4f1cb9607 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.157310] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-759d6aa9-8b60-4b06-83d3-1aa52c8d8d41 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.187661] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26783de4-8714-4d87-ac2e-67e71c4a79cb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.194311] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72f6f6a9-986a-4cc3-a72c-eff2967626cd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.206986] env[69648]: DEBUG nova.compute.provider_tree [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1946.215538] env[69648]: DEBUG nova.scheduler.client.report [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1946.228342] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.315s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.228840] env[69648]: ERROR nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1946.228840] env[69648]: Faults: ['InvalidArgument'] [ 1946.228840] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Traceback (most recent call last): [ 1946.228840] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1946.228840] env[69648]: ERROR 
nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] self.driver.spawn(context, instance, image_meta, [ 1946.228840] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1946.228840] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1946.228840] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1946.228840] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] self._fetch_image_if_missing(context, vi) [ 1946.228840] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1946.228840] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] image_cache(vi, tmp_image_ds_loc) [ 1946.228840] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] vm_util.copy_virtual_disk( [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] session._wait_for_task(vmdk_copy_task) [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] return self.wait_for_task(task_ref) [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] return evt.wait() [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] result = hub.switch() [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] return self.greenlet.switch() [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1946.229270] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] self.f(*self.args, **self.kw) [ 1946.229700] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1946.229700] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] raise exceptions.translate_fault(task_info.error) [ 1946.229700] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1946.229700] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Faults: ['InvalidArgument'] [ 1946.229700] env[69648]: ERROR nova.compute.manager [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] [ 1946.229700] env[69648]: DEBUG nova.compute.utils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1946.231200] env[69648]: DEBUG nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Build of instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f was re-scheduled: A specified parameter was not correct: fileType [ 1946.231200] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1946.231593] env[69648]: DEBUG nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1946.231770] env[69648]: DEBUG nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1946.231944] env[69648]: DEBUG nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1946.232124] env[69648]: DEBUG nova.network.neutron [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1946.578313] env[69648]: DEBUG nova.network.neutron [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1946.593551] env[69648]: INFO nova.compute.manager [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Took 0.36 seconds to deallocate network for instance. [ 1946.688028] env[69648]: INFO nova.scheduler.client.report [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Deleted allocations for instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f [ 1946.706600] env[69648]: DEBUG oslo_concurrency.lockutils [None req-7bbf9bae-15df-4c39-8698-7ceff21d367a tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 556.983s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.707951] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 360.822s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1946.708121] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1946.708336] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1946.708507] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.710937] env[69648]: INFO nova.compute.manager [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Terminating instance [ 1946.712630] env[69648]: DEBUG nova.compute.manager [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1946.712969] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1946.713490] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-47c6729e-61eb-46cc-b5a0-6ab313f60492 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.721089] env[69648]: DEBUG nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1946.726731] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b051b768-3839-4803-a286-155d3be5b635 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.755561] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3dc3db1c-43c0-45e9-8283-38e77f66f06f could not be found. [ 1946.755774] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1946.755950] env[69648]: INFO nova.compute.manager [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1946.756214] env[69648]: DEBUG oslo.service.loopingcall [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1946.760379] env[69648]: DEBUG nova.compute.manager [-] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1946.760508] env[69648]: DEBUG nova.network.neutron [-] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1946.772246] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1946.772482] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1946.773922] env[69648]: INFO nova.compute.claims [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1946.783023] env[69648]: DEBUG nova.network.neutron [-] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1946.795937] env[69648]: INFO nova.compute.manager [-] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] Took 0.04 seconds to deallocate network for instance. [ 1946.888464] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ecdbe21-149e-4b5c-b9ad-f04237567430 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.181s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.889326] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 151.874s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1946.889502] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3dc3db1c-43c0-45e9-8283-38e77f66f06f] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1946.889684] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "3dc3db1c-43c0-45e9-8283-38e77f66f06f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.967510] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ae8df9d-7722-4cf6-8a47-2fabecfc1ab8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.975730] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6703c942-0f61-4d86-b086-3bed76b69509 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.005156] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dfec30a-d0e5-439e-bac4-f7ca8cc88b73 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.011727] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2b2ff1c-cb1e-4b80-b8a2-f107b1b227fb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.024263] env[69648]: DEBUG nova.compute.provider_tree [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1947.032909] env[69648]: DEBUG nova.scheduler.client.report [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1947.046310] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1947.046746] env[69648]: DEBUG nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Start building networks asynchronously for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1947.080516] env[69648]: DEBUG nova.compute.utils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1947.081809] env[69648]: DEBUG nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1947.081989] env[69648]: DEBUG nova.network.neutron [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1947.091112] env[69648]: DEBUG nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1947.144995] env[69648]: DEBUG nova.policy [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dcce409aea2f4744bda144de55e46052', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd1ecc20de6ab4597a08d93cca45ed56c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1947.159897] env[69648]: DEBUG nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1947.186874] env[69648]: DEBUG nova.virt.hardware [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1947.187191] env[69648]: DEBUG nova.virt.hardware [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1947.187385] env[69648]: DEBUG nova.virt.hardware [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1947.187566] env[69648]: DEBUG nova.virt.hardware [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1947.187732] env[69648]: DEBUG nova.virt.hardware [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1947.187955] env[69648]: DEBUG nova.virt.hardware [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1947.188106] env[69648]: DEBUG nova.virt.hardware [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1947.188270] env[69648]: DEBUG nova.virt.hardware [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1947.188439] env[69648]: DEBUG nova.virt.hardware [None 
req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1947.188605] env[69648]: DEBUG nova.virt.hardware [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1947.188778] env[69648]: DEBUG nova.virt.hardware [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1947.189677] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cec64960-9efa-4327-9790-f226e305b9d4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.197675] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bd57229-38cc-4834-b503-58394f36e75e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.447752] env[69648]: DEBUG nova.network.neutron [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Successfully created port: 99445a6a-ea6e-48c2-a29f-8a3aff192f72 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1948.086155] env[69648]: DEBUG nova.network.neutron [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Successfully updated port: 99445a6a-ea6e-48c2-a29f-8a3aff192f72 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1948.104577] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "refresh_cache-cb6b7f04-1c44-4998-bd28-8a01c4b235e8" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1948.104755] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquired lock "refresh_cache-cb6b7f04-1c44-4998-bd28-8a01c4b235e8" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1948.107027] env[69648]: DEBUG nova.network.neutron [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1948.372796] env[69648]: DEBUG nova.network.neutron [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 
tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1948.527212] env[69648]: DEBUG nova.network.neutron [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Updating instance_info_cache with network_info: [{"id": "99445a6a-ea6e-48c2-a29f-8a3aff192f72", "address": "fa:16:3e:38:94:27", "network": {"id": "fc9dfa48-a79d-4532-a3ed-e1ad779b2906", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-685310351-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d1ecc20de6ab4597a08d93cca45ed56c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee018eb-75be-4037-a80a-07034d4eae35", "external-id": "nsx-vlan-transportzone-8", "segmentation_id": 8, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99445a6a-ea", "ovs_interfaceid": "99445a6a-ea6e-48c2-a29f-8a3aff192f72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1948.541554] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Releasing lock "refresh_cache-cb6b7f04-1c44-4998-bd28-8a01c4b235e8" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1948.541895] env[69648]: DEBUG nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Instance network_info: |[{"id": "99445a6a-ea6e-48c2-a29f-8a3aff192f72", "address": "fa:16:3e:38:94:27", "network": {"id": "fc9dfa48-a79d-4532-a3ed-e1ad779b2906", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-685310351-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d1ecc20de6ab4597a08d93cca45ed56c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee018eb-75be-4037-a80a-07034d4eae35", "external-id": "nsx-vlan-transportzone-8", "segmentation_id": 8, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99445a6a-ea", "ovs_interfaceid": "99445a6a-ea6e-48c2-a29f-8a3aff192f72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1948.542319] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:38:94:27', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2ee018eb-75be-4037-a80a-07034d4eae35', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '99445a6a-ea6e-48c2-a29f-8a3aff192f72', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1948.552024] env[69648]: DEBUG oslo.service.loopingcall [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1948.552024] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1948.552024] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-69780d43-0fcf-43ea-8385-847d617a38cc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1948.572291] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1948.572291] env[69648]: value = "task-3466671" [ 1948.572291] env[69648]: _type = "Task" [ 1948.572291] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1948.580156] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466671, 'name': CreateVM_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1948.673130] env[69648]: DEBUG nova.compute.manager [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Received event network-vif-plugged-99445a6a-ea6e-48c2-a29f-8a3aff192f72 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1948.673377] env[69648]: DEBUG oslo_concurrency.lockutils [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] Acquiring lock "cb6b7f04-1c44-4998-bd28-8a01c4b235e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1948.673558] env[69648]: DEBUG oslo_concurrency.lockutils [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] Lock "cb6b7f04-1c44-4998-bd28-8a01c4b235e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1948.673726] env[69648]: DEBUG oslo_concurrency.lockutils [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] Lock "cb6b7f04-1c44-4998-bd28-8a01c4b235e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1948.673894] env[69648]: DEBUG nova.compute.manager [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] No waiting events found dispatching network-vif-plugged-99445a6a-ea6e-48c2-a29f-8a3aff192f72 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1948.674075] env[69648]: WARNING nova.compute.manager [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Received unexpected event network-vif-plugged-99445a6a-ea6e-48c2-a29f-8a3aff192f72 for instance with vm_state building and task_state spawning. [ 1948.674239] env[69648]: DEBUG nova.compute.manager [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Received event network-changed-99445a6a-ea6e-48c2-a29f-8a3aff192f72 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1948.674392] env[69648]: DEBUG nova.compute.manager [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Refreshing instance network info cache due to event network-changed-99445a6a-ea6e-48c2-a29f-8a3aff192f72. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1948.674609] env[69648]: DEBUG oslo_concurrency.lockutils [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] Acquiring lock "refresh_cache-cb6b7f04-1c44-4998-bd28-8a01c4b235e8" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1948.674704] env[69648]: DEBUG oslo_concurrency.lockutils [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] Acquired lock "refresh_cache-cb6b7f04-1c44-4998-bd28-8a01c4b235e8" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1948.674859] env[69648]: DEBUG nova.network.neutron [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Refreshing network info cache for port 99445a6a-ea6e-48c2-a29f-8a3aff192f72 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1948.926297] env[69648]: DEBUG nova.network.neutron [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Updated VIF entry in instance network info cache for port 99445a6a-ea6e-48c2-a29f-8a3aff192f72. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1948.926768] env[69648]: DEBUG nova.network.neutron [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Updating instance_info_cache with network_info: [{"id": "99445a6a-ea6e-48c2-a29f-8a3aff192f72", "address": "fa:16:3e:38:94:27", "network": {"id": "fc9dfa48-a79d-4532-a3ed-e1ad779b2906", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-685310351-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d1ecc20de6ab4597a08d93cca45ed56c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee018eb-75be-4037-a80a-07034d4eae35", "external-id": "nsx-vlan-transportzone-8", "segmentation_id": 8, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99445a6a-ea", "ovs_interfaceid": "99445a6a-ea6e-48c2-a29f-8a3aff192f72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1948.936520] env[69648]: DEBUG oslo_concurrency.lockutils [req-75d06c22-3ea7-407a-97b3-80df7470e3f4 req-66ac2b43-88a1-40fc-b6ae-2cac98a75020 service nova] Releasing lock "refresh_cache-cb6b7f04-1c44-4998-bd28-8a01c4b235e8" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1949.082122] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466671, 'name': CreateVM_Task, 'duration_secs': 0.303865} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1949.082276] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1949.082864] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1949.083038] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1949.083357] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1949.083596] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ee7250d2-e57d-4ec5-9a41-632a3eed01a6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1949.087793] env[69648]: DEBUG oslo_vmware.api [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for the task: (returnval){ [ 1949.087793] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528ded2f-b975-6a5e-6a22-c0d03a0ad7bf" [ 1949.087793] env[69648]: _type = "Task" [ 1949.087793] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1949.094890] env[69648]: DEBUG oslo_vmware.api [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528ded2f-b975-6a5e-6a22-c0d03a0ad7bf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1949.598552] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1949.598819] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1949.599048] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1979.309970] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1980.065380] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1981.066438] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1982.065616] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1983.062887] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1983.062887] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1983.090482] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1985.053163] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Acquiring lock "6f2b5030-4606-4873-a80b-186b841cc7dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1985.053163] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Lock "6f2b5030-4606-4873-a80b-186b841cc7dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1988.064915] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1988.077071] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1988.077322] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1988.077498] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1988.077708] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1988.079155] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a1ac446-d0c0-4ff0-b21f-075cdf8bc40e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1988.088949] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93134dfd-10fb-4c72-8c02-27113fc936f6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1988.103810] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf3c9923-5fe4-400f-be51-c20e1e90b88b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1988.110753] env[69648]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e753bcc7-b282-4405-8159-4cf1a7f569fe {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1988.142300] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180974MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1988.142491] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1988.142653] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1988.217964] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 114bdafc-21f6-4a77-bf19-a444cbd8806c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1988.218198] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1988.218472] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1988.218540] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 723972b1-3f91-4c59-b265-3975644dadb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1988.218672] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 18745ec2-477d-427d-b2dd-997f73d9fd53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1988.218794] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1988.218922] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1988.219060] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1988.219173] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1988.219285] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cb6b7f04-1c44-4998-bd28-8a01c4b235e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1988.231659] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cc77a95f-ea00-4b01-96ac-8256672eeb39 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1988.243402] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6f2b5030-4606-4873-a80b-186b841cc7dd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1988.243637] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1988.243790] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1988.407597] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54eab3fb-fd7c-4512-9398-60d96b93841c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1988.415826] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3c9ac27-4819-4137-9136-d884dbae16f5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1988.446016] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1b85a0e-9123-4ee4-a458-036436eb071c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1988.453839] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19924327-5b9a-4def-965f-e4c631a7c9de {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1988.469182] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1988.478703] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1988.493553] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1988.493553] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.351s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1989.494188] env[69648]: DEBUG oslo_service.periodic_task [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1989.494577] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1990.763548] env[69648]: DEBUG oslo_concurrency.lockutils [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1991.065774] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1993.935039] env[69648]: WARNING oslo_vmware.rw_handles [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1993.935039] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1993.935039] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1993.935039] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1993.935039] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1993.935039] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 1993.935039] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1993.935039] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1993.935039] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1993.935039] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1993.935039] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1993.935039] env[69648]: ERROR oslo_vmware.rw_handles [ 1993.935039] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/dc21427c-ed90-44cc-9ee2-7b8a85dc4665/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1993.938180] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Caching image 
{{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1993.938459] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Copying Virtual Disk [datastore1] vmware_temp/dc21427c-ed90-44cc-9ee2-7b8a85dc4665/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/dc21427c-ed90-44cc-9ee2-7b8a85dc4665/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1993.938768] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-41fcc97a-5ecd-4b2a-84e0-0df27b0087be {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.946254] env[69648]: DEBUG oslo_vmware.api [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for the task: (returnval){ [ 1993.946254] env[69648]: value = "task-3466672" [ 1993.946254] env[69648]: _type = "Task" [ 1993.946254] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1993.953961] env[69648]: DEBUG oslo_vmware.api [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Task: {'id': task-3466672, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1994.065683] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1994.065888] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1994.066029] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1994.095208] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1994.095364] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1994.095481] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1994.095609] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1994.095733] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1994.095857] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1994.095981] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1994.096170] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1994.096310] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1994.096430] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1994.096555] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1994.456043] env[69648]: DEBUG oslo_vmware.exceptions [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1994.456217] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1994.456705] env[69648]: ERROR nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1994.456705] env[69648]: Faults: ['InvalidArgument'] [ 1994.456705] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Traceback (most recent call last): [ 1994.456705] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1994.456705] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] yield resources [ 1994.456705] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1994.456705] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] self.driver.spawn(context, instance, image_meta, [ 1994.456705] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1994.456705] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1994.456705] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1994.456705] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] self._fetch_image_if_missing(context, vi) [ 1994.456705] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] image_cache(vi, tmp_image_ds_loc) [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] vm_util.copy_virtual_disk( [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] session._wait_for_task(vmdk_copy_task) [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] return self.wait_for_task(task_ref) [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] return evt.wait() [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] result = hub.switch() [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1994.457322] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] return self.greenlet.switch() [ 1994.457750] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1994.457750] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] self.f(*self.args, **self.kw) [ 1994.457750] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1994.457750] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] raise exceptions.translate_fault(task_info.error) [ 1994.457750] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1994.457750] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Faults: ['InvalidArgument'] [ 1994.457750] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] [ 1994.457750] env[69648]: INFO nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Terminating instance [ 1994.458560] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1994.458781] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1994.459030] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6c0f9a68-c997-45ad-8095-c87bb55fa65b 
{{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1994.461335] env[69648]: DEBUG nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1994.461525] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1994.462263] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6108a8e-b57a-4dc0-a547-0f37a74381e2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1994.468281] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1994.468502] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3ec5a17d-2e9d-4d78-9399-caefcad1219d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1994.470530] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1994.470698] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1994.471671] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9438c29b-b2ff-40d0-ba84-8b0de4879068 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1994.476307] env[69648]: DEBUG oslo_vmware.api [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Waiting for the task: (returnval){ [ 1994.476307] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]524584c9-6e8c-9409-2985-08cf0871ee52" [ 1994.476307] env[69648]: _type = "Task" [ 1994.476307] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1994.483875] env[69648]: DEBUG oslo_vmware.api [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]524584c9-6e8c-9409-2985-08cf0871ee52, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1994.530425] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1994.530667] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1994.530803] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Deleting the datastore file [datastore1] 114bdafc-21f6-4a77-bf19-a444cbd8806c {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1994.531099] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fab85dad-9281-4a96-8834-71485e0c2263 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1994.537603] env[69648]: DEBUG oslo_vmware.api [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for the task: (returnval){ [ 1994.537603] env[69648]: value = "task-3466674" [ 1994.537603] env[69648]: _type = "Task" [ 1994.537603] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1994.544810] env[69648]: DEBUG oslo_vmware.api [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Task: {'id': task-3466674, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1994.986955] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1994.987305] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Creating directory with path [datastore1] vmware_temp/7b5cc4f1-f38d-4e93-96d6-dbbe9c2d98fc/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1994.987462] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c1bff504-5aa6-474a-99bc-2fcbe3c6d208 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1994.998443] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Created directory with path [datastore1] vmware_temp/7b5cc4f1-f38d-4e93-96d6-dbbe9c2d98fc/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1994.998657] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Fetch image to [datastore1] vmware_temp/7b5cc4f1-f38d-4e93-96d6-dbbe9c2d98fc/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1994.998798] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/7b5cc4f1-f38d-4e93-96d6-dbbe9c2d98fc/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1994.999507] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15151cc9-3278-4828-af98-0f043cc541f1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.006023] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ed025d3-7b09-492d-a742-2e29c79afb3a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.015907] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-028cb7b3-f195-4548-ac46-6d659e771b12 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.048503] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-361c632c-fae8-4b61-989f-2d63706409cd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.055125] env[69648]: DEBUG oslo_vmware.api [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Task: {'id': task-3466674, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072087} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1995.056513] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1995.056702] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1995.056875] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1995.057065] env[69648]: INFO nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Took 0.60 seconds to destroy the instance on the hypervisor. 
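A minimal sketch of how the DeleteDatastoreFile_Task / "Waiting for the task" sequence recorded above is typically driven through oslo.vmware; the vCenter host, credentials and datastore path below are placeholders (not values from this log), and only VMwareAPISession, invoke_api and wait_for_task are taken from the module paths shown in the surrounding records.

    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vexc

    # Placeholder connection details -- not taken from this log.
    session = vmware_api.VMwareAPISession(
        host='vcenter.example.org',
        server_username='svc-nova',
        server_password='secret',
        api_retry_count=10,
        task_poll_interval=0.5)

    # FileManager.DeleteDatastoreFile_Task returns a Task managed object;
    # wait_for_task() polls it (the "progress is 0%" records above) and raises
    # a translated fault such as InvalidArgument if the task fails.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] <instance-uuid>',   # placeholder datastore path
        datacenter=None)                       # real callers pass the datacenter moref
    try:
        session.wait_for_task(task)
    except vexc.VimFaultException as err:
        print(err.fault_list, err.msg)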
[ 1995.058767] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b9e5c642-f7ad-4dfa-97c1-d37648199eab {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.060586] env[69648]: DEBUG nova.compute.claims [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1995.060757] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1995.061016] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1995.081624] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1995.133624] env[69648]: DEBUG oslo_vmware.rw_handles [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b5cc4f1-f38d-4e93-96d6-dbbe9c2d98fc/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1995.195695] env[69648]: DEBUG oslo_vmware.rw_handles [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1995.195891] env[69648]: DEBUG oslo_vmware.rw_handles [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b5cc4f1-f38d-4e93-96d6-dbbe9c2d98fc/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1995.304837] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a17b233f-64e8-4f5b-9cba-4d785aabe9cc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.312534] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a3cd400-f377-4eea-b10d-a149c4acdc47 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.342926] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46aa4807-bdc1-413e-806c-12d7858ac412 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.350012] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73cb5969-2fca-47df-803f-1e6e1b32735e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.362742] env[69648]: DEBUG nova.compute.provider_tree [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1995.371746] env[69648]: DEBUG nova.scheduler.client.report [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1995.385007] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.324s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1995.385576] env[69648]: ERROR nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1995.385576] env[69648]: Faults: ['InvalidArgument'] [ 1995.385576] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Traceback (most recent call last): [ 1995.385576] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1995.385576] 
env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] self.driver.spawn(context, instance, image_meta, [ 1995.385576] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1995.385576] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1995.385576] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1995.385576] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] self._fetch_image_if_missing(context, vi) [ 1995.385576] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1995.385576] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] image_cache(vi, tmp_image_ds_loc) [ 1995.385576] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] vm_util.copy_virtual_disk( [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] session._wait_for_task(vmdk_copy_task) [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] return self.wait_for_task(task_ref) [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] return evt.wait() [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] result = hub.switch() [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] return self.greenlet.switch() [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1995.385961] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] self.f(*self.args, **self.kw) [ 1995.386346] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1995.386346] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] raise exceptions.translate_fault(task_info.error) [ 1995.386346] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1995.386346] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Faults: ['InvalidArgument'] [ 1995.386346] env[69648]: ERROR nova.compute.manager [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] [ 1995.386632] env[69648]: DEBUG nova.compute.utils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1995.388075] env[69648]: DEBUG nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Build of instance 114bdafc-21f6-4a77-bf19-a444cbd8806c was re-scheduled: A specified parameter was not correct: fileType [ 1995.388075] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1995.388522] env[69648]: DEBUG nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1995.388742] env[69648]: DEBUG nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1995.388960] env[69648]: DEBUG nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1995.389529] env[69648]: DEBUG nova.network.neutron [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1995.765421] env[69648]: DEBUG nova.network.neutron [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1995.778394] env[69648]: INFO nova.compute.manager [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Took 0.39 seconds to deallocate network for instance. [ 1995.874834] env[69648]: INFO nova.scheduler.client.report [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Deleted allocations for instance 114bdafc-21f6-4a77-bf19-a444cbd8806c [ 1995.898405] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b697a02b-381d-4b7d-b7df-e0593b5e1d37 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "114bdafc-21f6-4a77-bf19-a444cbd8806c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 594.222s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1995.900320] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "114bdafc-21f6-4a77-bf19-a444cbd8806c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 398.413s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1995.900320] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "114bdafc-21f6-4a77-bf19-a444cbd8806c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1995.900792] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "114bdafc-21f6-4a77-bf19-a444cbd8806c-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1995.901053] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "114bdafc-21f6-4a77-bf19-a444cbd8806c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1995.904028] env[69648]: INFO nova.compute.manager [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Terminating instance [ 1995.905195] env[69648]: DEBUG nova.compute.manager [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1995.905455] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1995.905976] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c11c2629-1f40-489c-9e68-288df16d3369 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.916091] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a63ded89-d747-4959-a00f-f9cc8d2c0f1c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.926999] env[69648]: DEBUG nova.compute.manager [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1995.947096] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 114bdafc-21f6-4a77-bf19-a444cbd8806c could not be found. 
[ 1995.947303] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1995.947477] env[69648]: INFO nova.compute.manager [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1995.947711] env[69648]: DEBUG oslo.service.loopingcall [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1995.947967] env[69648]: DEBUG nova.compute.manager [-] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1995.948131] env[69648]: DEBUG nova.network.neutron [-] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1995.978016] env[69648]: DEBUG nova.network.neutron [-] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1995.979308] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1995.979543] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1995.981449] env[69648]: INFO nova.compute.claims [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1995.987252] env[69648]: INFO nova.compute.manager [-] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] Took 0.04 seconds to deallocate network for instance. 
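The inventory reported for provider d38a352b-7808-44da-8216-792e96aadc88 in the nearby records translates into schedulable capacity via the standard placement formula, usable = (total - reserved) * allocation_ratio, with max_unit capping any single allocation. A small worked example using those numbers:

    # Capacity implied by the reported inventory (numbers from the records above).
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 94},
    }
    for rc, inv in inventory.items():
        usable = int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
        print(f"{rc}: {usable} schedulable, max {inv['max_unit']} per allocation")
    # -> VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400

With that capacity, the m1.nano claim recorded here (1 VCPU, 128 MB RAM, 1 GB root disk) fits comfortably, which is why the claim succeeds on node domain-c8.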
[ 1996.071125] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9d276408-fe03-490c-a13d-26d45ef7a5c9 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "114bdafc-21f6-4a77-bf19-a444cbd8806c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.171s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1996.072361] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "114bdafc-21f6-4a77-bf19-a444cbd8806c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 201.056s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1996.072557] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 114bdafc-21f6-4a77-bf19-a444cbd8806c] During sync_power_state the instance has a pending task (deleting). Skip. [ 1996.072859] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "114bdafc-21f6-4a77-bf19-a444cbd8806c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1996.177601] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e1702f1-9cac-4268-a10f-fa948dc87d05 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.186052] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b084c4c9-a2be-4f24-a078-ee83a74508db {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.216174] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-350cca4a-1723-4970-88db-011af874b2ac {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.223378] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5defb842-1926-4de8-89b7-9cf8f701aa05 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.236690] env[69648]: DEBUG nova.compute.provider_tree [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1996.247149] env[69648]: DEBUG nova.scheduler.client.report [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 
'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1996.261833] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1996.262340] env[69648]: DEBUG nova.compute.manager [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1996.298935] env[69648]: DEBUG nova.compute.utils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1996.300216] env[69648]: DEBUG nova.compute.manager [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1996.301190] env[69648]: DEBUG nova.network.neutron [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1996.309794] env[69648]: DEBUG nova.compute.manager [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1996.385343] env[69648]: DEBUG nova.policy [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bcf98a016ef4ac0ac9715c0feeed399', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33a61bab01e5406b931f6e77ff312517', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 1996.385343] env[69648]: DEBUG nova.compute.manager [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1996.403962] env[69648]: DEBUG nova.virt.hardware [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1996.404223] env[69648]: DEBUG nova.virt.hardware [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1996.404376] env[69648]: DEBUG nova.virt.hardware [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1996.404591] env[69648]: DEBUG nova.virt.hardware [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1996.404705] env[69648]: DEBUG nova.virt.hardware [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1996.404857] env[69648]: DEBUG nova.virt.hardware [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1996.405079] env[69648]: DEBUG nova.virt.hardware [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1996.405239] env[69648]: DEBUG nova.virt.hardware [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1996.405406] env[69648]: DEBUG nova.virt.hardware [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 
tempest-ServersTestJSON-1251889119-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1996.405572] env[69648]: DEBUG nova.virt.hardware [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1996.405747] env[69648]: DEBUG nova.virt.hardware [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1996.406698] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a461c6e0-8a1c-4918-bf90-e33eb97f7ab9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.414706] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fabb8c9-4a2b-40a8-b9d7-59f630557f89 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.754957] env[69648]: DEBUG nova.network.neutron [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Successfully created port: 1f9652d3-9357-48b1-8092-b3f76b6d59cd {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1997.409671] env[69648]: DEBUG nova.network.neutron [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Successfully updated port: 1f9652d3-9357-48b1-8092-b3f76b6d59cd {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1997.425481] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquiring lock "refresh_cache-cc77a95f-ea00-4b01-96ac-8256672eeb39" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1997.425628] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquired lock "refresh_cache-cc77a95f-ea00-4b01-96ac-8256672eeb39" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1997.425782] env[69648]: DEBUG nova.network.neutron [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1997.469056] env[69648]: DEBUG nova.network.neutron [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1997.635652] env[69648]: DEBUG nova.network.neutron [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Updating instance_info_cache with network_info: [{"id": "1f9652d3-9357-48b1-8092-b3f76b6d59cd", "address": "fa:16:3e:a7:fe:00", "network": {"id": "59ea0696-eb55-4ca5-b550-1565eaf8dd1e", "bridge": "br-int", "label": "tempest-ServersTestJSON-257736754-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "33a61bab01e5406b931f6e77ff312517", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0ded8bac-871f-491b-94ec-cb67c08bc828", "external-id": "nsx-vlan-transportzone-212", "segmentation_id": 212, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f9652d3-93", "ovs_interfaceid": "1f9652d3-9357-48b1-8092-b3f76b6d59cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1997.652217] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Releasing lock "refresh_cache-cc77a95f-ea00-4b01-96ac-8256672eeb39" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1997.652487] env[69648]: DEBUG nova.compute.manager [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Instance network_info: |[{"id": "1f9652d3-9357-48b1-8092-b3f76b6d59cd", "address": "fa:16:3e:a7:fe:00", "network": {"id": "59ea0696-eb55-4ca5-b550-1565eaf8dd1e", "bridge": "br-int", "label": "tempest-ServersTestJSON-257736754-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "33a61bab01e5406b931f6e77ff312517", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0ded8bac-871f-491b-94ec-cb67c08bc828", "external-id": "nsx-vlan-transportzone-212", "segmentation_id": 212, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f9652d3-93", "ovs_interfaceid": "1f9652d3-9357-48b1-8092-b3f76b6d59cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1997.652885] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None 
req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a7:fe:00', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0ded8bac-871f-491b-94ec-cb67c08bc828', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1f9652d3-9357-48b1-8092-b3f76b6d59cd', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1997.660687] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Creating folder: Project (33a61bab01e5406b931f6e77ff312517). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1997.661302] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cd14d595-46d9-42ad-9413-e539d92a7791 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.671770] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Created folder: Project (33a61bab01e5406b931f6e77ff312517) in parent group-v692308. [ 1997.671966] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Creating folder: Instances. Parent ref: group-v692415. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1997.672208] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-be829374-ff49-4f16-bfdd-1e64f0d2e168 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.681347] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Created folder: Instances in parent group-v692415. [ 1997.681574] env[69648]: DEBUG oslo.service.loopingcall [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1997.681759] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1997.681960] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f5eca15e-87fc-42fa-a6fb-2e21a750e533 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.706665] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1997.706665] env[69648]: value = "task-3466677" [ 1997.706665] env[69648]: _type = "Task" [ 1997.706665] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1997.715409] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466677, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1997.803366] env[69648]: DEBUG nova.compute.manager [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Received event network-vif-plugged-1f9652d3-9357-48b1-8092-b3f76b6d59cd {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1997.803591] env[69648]: DEBUG oslo_concurrency.lockutils [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] Acquiring lock "cc77a95f-ea00-4b01-96ac-8256672eeb39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1997.803813] env[69648]: DEBUG oslo_concurrency.lockutils [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] Lock "cc77a95f-ea00-4b01-96ac-8256672eeb39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1997.803990] env[69648]: DEBUG oslo_concurrency.lockutils [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] Lock "cc77a95f-ea00-4b01-96ac-8256672eeb39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1997.804324] env[69648]: DEBUG nova.compute.manager [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] No waiting events found dispatching network-vif-plugged-1f9652d3-9357-48b1-8092-b3f76b6d59cd {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1997.804523] env[69648]: WARNING nova.compute.manager [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Received unexpected event network-vif-plugged-1f9652d3-9357-48b1-8092-b3f76b6d59cd for instance with vm_state building and task_state spawning. [ 1997.804698] env[69648]: DEBUG nova.compute.manager [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Received event network-changed-1f9652d3-9357-48b1-8092-b3f76b6d59cd {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1997.804860] env[69648]: DEBUG nova.compute.manager [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Refreshing instance network info cache due to event network-changed-1f9652d3-9357-48b1-8092-b3f76b6d59cd. 
{{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1997.805060] env[69648]: DEBUG oslo_concurrency.lockutils [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] Acquiring lock "refresh_cache-cc77a95f-ea00-4b01-96ac-8256672eeb39" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1997.805334] env[69648]: DEBUG oslo_concurrency.lockutils [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] Acquired lock "refresh_cache-cc77a95f-ea00-4b01-96ac-8256672eeb39" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1997.805394] env[69648]: DEBUG nova.network.neutron [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Refreshing network info cache for port 1f9652d3-9357-48b1-8092-b3f76b6d59cd {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1998.097539] env[69648]: DEBUG nova.network.neutron [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Updated VIF entry in instance network info cache for port 1f9652d3-9357-48b1-8092-b3f76b6d59cd. {{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1998.097900] env[69648]: DEBUG nova.network.neutron [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Updating instance_info_cache with network_info: [{"id": "1f9652d3-9357-48b1-8092-b3f76b6d59cd", "address": "fa:16:3e:a7:fe:00", "network": {"id": "59ea0696-eb55-4ca5-b550-1565eaf8dd1e", "bridge": "br-int", "label": "tempest-ServersTestJSON-257736754-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "33a61bab01e5406b931f6e77ff312517", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0ded8bac-871f-491b-94ec-cb67c08bc828", "external-id": "nsx-vlan-transportzone-212", "segmentation_id": 212, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f9652d3-93", "ovs_interfaceid": "1f9652d3-9357-48b1-8092-b3f76b6d59cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1998.107376] env[69648]: DEBUG oslo_concurrency.lockutils [req-a1e6b94c-9ccf-4aab-8fb3-38c675679660 req-ea16386d-6963-440d-9bae-b092bff5b203 service nova] Releasing lock "refresh_cache-cc77a95f-ea00-4b01-96ac-8256672eeb39" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1998.216875] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466677, 'name': CreateVM_Task, 'duration_secs': 0.317995} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1998.216986] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1998.217643] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1998.217833] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1998.218204] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1998.218454] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ca5386be-fb05-4333-b7e2-0def55fb9c17 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.222650] env[69648]: DEBUG oslo_vmware.api [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Waiting for the task: (returnval){ [ 1998.222650] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]524c5000-a569-faa5-9262-d0e701797da2" [ 1998.222650] env[69648]: _type = "Task" [ 1998.222650] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1998.229832] env[69648]: DEBUG oslo_vmware.api [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]524c5000-a569-faa5-9262-d0e701797da2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1998.734730] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1998.734992] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1998.735262] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2013.800555] env[69648]: DEBUG oslo_concurrency.lockutils [None req-aa0e69b9-93f3-4042-aa18-012e60ec4c1b tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "cb6b7f04-1c44-4998-bd28-8a01c4b235e8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2030.275622] env[69648]: DEBUG oslo_concurrency.lockutils [None req-67ae13e4-7193-4452-90b4-292067acbe01 tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquiring lock "cc77a95f-ea00-4b01-96ac-8256672eeb39" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2040.065648] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2041.065565] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2042.065957] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2043.060858] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2043.064452] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task 
ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2044.792054] env[69648]: WARNING oslo_vmware.rw_handles [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2044.792054] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2044.792054] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2044.792054] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2044.792054] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2044.792054] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 2044.792054] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2044.792054] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2044.792054] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2044.792054] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2044.792054] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2044.792054] env[69648]: ERROR oslo_vmware.rw_handles [ 2044.792054] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/7b5cc4f1-f38d-4e93-96d6-dbbe9c2d98fc/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2044.794259] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2044.794507] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Copying Virtual Disk [datastore1] vmware_temp/7b5cc4f1-f38d-4e93-96d6-dbbe9c2d98fc/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/7b5cc4f1-f38d-4e93-96d6-dbbe9c2d98fc/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2044.794814] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-37eff53c-27d5-443c-a590-a2a07fb2fed4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.802837] env[69648]: DEBUG oslo_vmware.api [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 
tempest-ServerActionsTestOtherB-147759986-project-member] Waiting for the task: (returnval){ [ 2044.802837] env[69648]: value = "task-3466678" [ 2044.802837] env[69648]: _type = "Task" [ 2044.802837] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2044.810935] env[69648]: DEBUG oslo_vmware.api [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Task: {'id': task-3466678, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2045.312955] env[69648]: DEBUG oslo_vmware.exceptions [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2045.314043] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2045.314043] env[69648]: ERROR nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2045.314043] env[69648]: Faults: ['InvalidArgument'] [ 2045.314043] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Traceback (most recent call last): [ 2045.314043] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2045.314043] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] yield resources [ 2045.314043] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2045.314043] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] self.driver.spawn(context, instance, image_meta, [ 2045.314043] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2045.314043] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] self._fetch_image_if_missing(context, vi) [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing 
[ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] image_cache(vi, tmp_image_ds_loc) [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] vm_util.copy_virtual_disk( [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] session._wait_for_task(vmdk_copy_task) [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] return self.wait_for_task(task_ref) [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] return evt.wait() [ 2045.314416] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2045.314790] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] result = hub.switch() [ 2045.314790] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2045.314790] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] return self.greenlet.switch() [ 2045.314790] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2045.314790] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] self.f(*self.args, **self.kw) [ 2045.314790] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2045.314790] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] raise exceptions.translate_fault(task_info.error) [ 2045.314790] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2045.314790] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Faults: ['InvalidArgument'] [ 2045.314790] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] [ 2045.314790] env[69648]: INFO nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Terminating instance [ 2045.315965] env[69648]: DEBUG 
oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2045.316202] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2045.316445] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b038159e-67f3-4e20-999e-51f938739829 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.318541] env[69648]: DEBUG nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2045.318737] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2045.319471] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18eddbc0-0375-4f19-aeb5-e5b723ccc1f4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.325989] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2045.326220] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6133b4bf-2e9a-4461-ab53-87088415c293 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.328343] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2045.328518] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2045.329467] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f3ec6a32-1c18-4e23-af0d-28ef50e5bcbf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.333985] env[69648]: DEBUG oslo_vmware.api [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for the task: (returnval){ [ 2045.333985] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52ae275b-b274-294c-8e54-dc65929151a0" [ 2045.333985] env[69648]: _type = "Task" [ 2045.333985] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2045.347196] env[69648]: DEBUG oslo_vmware.api [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52ae275b-b274-294c-8e54-dc65929151a0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2045.387362] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2045.387594] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2045.387781] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Deleting the datastore file [datastore1] 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2045.388048] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5ce234ba-cc45-431a-8eba-5dd28b949e4b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.395133] env[69648]: DEBUG oslo_vmware.api [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Waiting for the task: (returnval){ [ 2045.395133] env[69648]: value = "task-3466680" [ 2045.395133] env[69648]: _type = "Task" [ 2045.395133] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2045.403618] env[69648]: DEBUG oslo_vmware.api [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Task: {'id': task-3466680, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2045.844427] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2045.844713] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Creating directory with path [datastore1] vmware_temp/30acdc52-49cb-4dd0-9b60-c07aea1aa6a0/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2045.844977] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-edf2055a-bdd8-48ac-b736-580f914abd95 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.857934] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Created directory with path [datastore1] vmware_temp/30acdc52-49cb-4dd0-9b60-c07aea1aa6a0/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2045.858162] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Fetch image to [datastore1] vmware_temp/30acdc52-49cb-4dd0-9b60-c07aea1aa6a0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2045.858341] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/30acdc52-49cb-4dd0-9b60-c07aea1aa6a0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2045.859126] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46bbe6f1-a98c-4bd4-ac4e-994299eb4f29 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.865998] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84077da6-e9a1-4eea-af63-c7ff95d8de94 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.875154] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-935fc84c-6ce7-47a1-946b-d33d009edfb6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.909439] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d3c11e5-749d-4e6f-95d9-474b59a99034 {{(pid=69648) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.916982] env[69648]: DEBUG oslo_vmware.api [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Task: {'id': task-3466680, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079738} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2045.918470] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2045.918664] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2045.918912] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2045.919031] env[69648]: INFO nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Took 0.60 seconds to destroy the instance on the hypervisor. 
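The task records in this stretch (CreateVM_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task) all follow the same oslo.vmware polling lifecycle: a "Waiting for the task" block, periodic "progress is N%" polls, and a completion record that carries 'duration_secs'. A minimal sketch for pulling those completion times out of a log slice like this one; the regex, the summarize_tasks name, and the nova-compute.log filename are assumptions for illustration, not nova or oslo.vmware APIs.

```python
import re

# Hypothetical helper (not part of nova or oslo.vmware): summarise completed,
# numbered VMware task records from a nova-compute log slice like the one above.
# Matches records such as:
#   Task: {'id': task-3466680, 'name': DeleteDatastoreFile_Task,
#          'duration_secs': 0.079738} completed successfully.
TASK_DONE = re.compile(
    r"Task: \{'id': (?P<task>task-\d+), 'name': (?P<name>\w+)"
    r"(?:, 'duration_secs': (?P<secs>[\d.]+))?\} completed successfully"
)

def summarize_tasks(log_text: str) -> dict:
    """Map task id -> (task name, duration in seconds, or None if not logged)."""
    return {
        m.group("task"): (m.group("name"),
                          float(m.group("secs")) if m.group("secs") else None)
        for m in TASK_DONE.finditer(log_text)
    }

if __name__ == "__main__":
    with open("nova-compute.log") as fh:  # assumed local copy of this log
        for task, (name, secs) in summarize_tasks(fh.read()).items():
            print(f"{task}: {name} completed in {secs}s")
```

Run against this slice it would report task-3466677 (CreateVM_Task, 0.317995s) and task-3466680 (DeleteDatastoreFile_Task, 0.079738s); session-scoped SearchDatastore tasks are deliberately left out since they never log a numbered id here.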
[ 2045.921107] env[69648]: DEBUG nova.compute.claims [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2045.921304] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2045.921529] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2045.924046] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e2de3367-0de8-44f0-89fb-1e306d20477a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.946585] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2046.000091] env[69648]: DEBUG oslo_vmware.rw_handles [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/30acdc52-49cb-4dd0-9b60-c07aea1aa6a0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2046.059981] env[69648]: DEBUG oslo_vmware.rw_handles [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2046.060196] env[69648]: DEBUG oslo_vmware.rw_handles [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/30acdc52-49cb-4dd0-9b60-c07aea1aa6a0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2046.079032] env[69648]: DEBUG nova.scheduler.client.report [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Refreshing inventories for resource provider d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2046.097202] env[69648]: DEBUG nova.scheduler.client.report [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Updating ProviderTree inventory for provider d38a352b-7808-44da-8216-792e96aadc88 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2046.097437] env[69648]: DEBUG nova.compute.provider_tree [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Updating inventory in ProviderTree for provider d38a352b-7808-44da-8216-792e96aadc88 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2046.109324] env[69648]: DEBUG nova.scheduler.client.report [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Refreshing aggregate associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, aggregates: None {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2046.132145] env[69648]: DEBUG nova.scheduler.client.report [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Refreshing trait associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2046.253793] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5078a65-b589-497b-b340-4edbf7d7d23f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2046.261516] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac291aaf-46e1-418b-bf1b-396d191757a2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2046.291940] env[69648]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bda2aa66-68a7-4b49-af26-9aa4cb18166c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2046.298786] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fcc3aaf-9010-4e64-9b63-810e07f206ef {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2046.311651] env[69648]: DEBUG nova.compute.provider_tree [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2046.319877] env[69648]: DEBUG nova.scheduler.client.report [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2046.333572] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.412s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2046.334128] env[69648]: ERROR nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2046.334128] env[69648]: Faults: ['InvalidArgument'] [ 2046.334128] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Traceback (most recent call last): [ 2046.334128] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2046.334128] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] self.driver.spawn(context, instance, image_meta, [ 2046.334128] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2046.334128] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2046.334128] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2046.334128] env[69648]: ERROR nova.compute.manager [instance: 
0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] self._fetch_image_if_missing(context, vi) [ 2046.334128] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2046.334128] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] image_cache(vi, tmp_image_ds_loc) [ 2046.334128] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] vm_util.copy_virtual_disk( [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] session._wait_for_task(vmdk_copy_task) [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] return self.wait_for_task(task_ref) [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] return evt.wait() [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] result = hub.switch() [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] return self.greenlet.switch() [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2046.334478] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] self.f(*self.args, **self.kw) [ 2046.334806] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2046.334806] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] raise exceptions.translate_fault(task_info.error) [ 2046.334806] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2046.334806] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Faults: ['InvalidArgument'] [ 2046.334806] env[69648]: ERROR nova.compute.manager [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] [ 2046.334806] env[69648]: DEBUG nova.compute.utils 
[None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2046.336142] env[69648]: DEBUG nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Build of instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b was re-scheduled: A specified parameter was not correct: fileType [ 2046.336142] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2046.336511] env[69648]: DEBUG nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2046.336686] env[69648]: DEBUG nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2046.336891] env[69648]: DEBUG nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2046.337116] env[69648]: DEBUG nova.network.neutron [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2046.747433] env[69648]: DEBUG nova.network.neutron [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2046.759479] env[69648]: INFO nova.compute.manager [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Took 0.42 seconds to deallocate network for instance. 
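The ERROR records above are easy to lose in the surrounding DEBUG noise: instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b fails to spawn with a VimFaultException (InvalidArgument on fileType) and is re-scheduled. A small sketch, assuming a local copy of this log, that groups "Failed to build and run instance" records by instance UUID and exception class; the FAIL pattern and build_failures name are hypothetical, not nova code.

```python
import re
from collections import defaultdict

# Hypothetical log-scraping helper (not nova code): group build failures by
# instance UUID and exception class, e.g. the
# oslo_vmware.exceptions.VimFaultException recorded above for 0b76d29b-....
FAIL = re.compile(
    r"\[instance: (?P<uuid>[0-9a-f]{8}(?:-[0-9a-f]{4}){3}-[0-9a-f]{12})\] "
    r"Failed to build and run instance: (?P<exc>[\w.]+):"
)

def build_failures(log_text: str) -> dict[str, set[str]]:
    """Map instance UUID -> set of exception classes that aborted its build."""
    failures: dict[str, set[str]] = defaultdict(set)
    for m in FAIL.finditer(log_text):
        failures[m.group("uuid")].add(m.group(" exc".strip()))
    return dict(failures)
```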
[ 2046.854796] env[69648]: INFO nova.scheduler.client.report [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Deleted allocations for instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b [ 2046.873669] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f0778c7b-27f8-4b0e-a4e8-8e9733a7b622 tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 624.395s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2046.875171] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 428.082s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2046.875401] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Acquiring lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2046.875615] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2046.875814] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2046.877909] env[69648]: INFO nova.compute.manager [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Terminating instance [ 2046.880454] env[69648]: DEBUG nova.compute.manager [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2046.880649] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2046.881427] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d7954dab-b0cf-4566-9566-17d9a1cc805e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2046.890808] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bab7d737-ef3f-4982-8116-2bee5bba1dc1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2046.901881] env[69648]: DEBUG nova.compute.manager [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2046.922621] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b could not be found. [ 2046.922827] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2046.923015] env[69648]: INFO nova.compute.manager [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2046.923257] env[69648]: DEBUG oslo.service.loopingcall [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2046.923474] env[69648]: DEBUG nova.compute.manager [-] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2046.923571] env[69648]: DEBUG nova.network.neutron [-] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2046.949173] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2046.949310] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2046.950769] env[69648]: INFO nova.compute.claims [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2046.957814] env[69648]: DEBUG nova.network.neutron [-] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2046.965537] env[69648]: INFO nova.compute.manager [-] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] Took 0.04 seconds to deallocate network for instance. [ 2047.068746] env[69648]: DEBUG oslo_concurrency.lockutils [None req-a7841927-e4b4-455f-a107-c6ed9241885c tempest-ServerActionsTestOtherB-147759986 tempest-ServerActionsTestOtherB-147759986-project-member] Lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.193s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2047.069563] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 252.054s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2047.069755] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0b76d29b-a8b2-4b65-88b7-a6e5df9a602b] During sync_power_state the instance has a pending task (deleting). Skip. 
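The lock bookkeeping above comes from per-instance serialization in oslo_concurrency: the build, the terminate and the periodic power-state sync all queue on the same named lock keyed by the instance UUID, and the "waited"/"held" figures are measured around acquire and release. A rough illustration of that pattern with plain threading (hypothetical helper, not the lockutils implementation):

    import threading
    import time
    from collections import defaultdict

    _locks = defaultdict(threading.Lock)     # one lock per instance UUID (hypothetical registry)

    def locked_instance_op(uuid, op_name, fn):
        # Mirrors the 'acquired by ... waited Ns' / 'released ... held Ns' pairs in the log.
        lock = _locks[uuid]
        t0 = time.monotonic()
        with lock:
            waited = time.monotonic() - t0
            print(f'Lock "{uuid}" acquired by "{op_name}" :: waited {waited:.3f}s')
            t1 = time.monotonic()
            try:
                return fn()
            finally:
                held = time.monotonic() - t1
                print(f'Lock "{uuid}" released by "{op_name}" :: held {held:.3f}s')

Read this way, the 428.082s that do_terminate_instance reports waiting is simply the time it spent queued behind the failed build, which held the same instance lock for 624.395s before releasing it.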
[ 2047.069918] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "0b76d29b-a8b2-4b65-88b7-a6e5df9a602b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2047.157032] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cd54219-2e2f-4530-a122-7456f80e9531 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.164942] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcd81b39-aab2-40fc-bdfd-51c9630d59c0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.193975] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f3fe6c6-1c40-44c4-b9b8-f42852c81c74 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.201095] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b4126f2-20fa-4d73-a05d-3959adac610c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.214060] env[69648]: DEBUG nova.compute.provider_tree [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2047.222718] env[69648]: DEBUG nova.scheduler.client.report [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2047.235769] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2047.236248] env[69648]: DEBUG nova.compute.manager [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Start building networks asynchronously for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2047.271528] env[69648]: DEBUG nova.compute.utils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2047.273017] env[69648]: DEBUG nova.compute.manager [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2047.273208] env[69648]: DEBUG nova.network.neutron [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2047.280882] env[69648]: DEBUG nova.compute.manager [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2047.335829] env[69648]: DEBUG nova.policy [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c2a70d0ddb684eec981344383b24d2fc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a8642682866147d6a4b04d5a31fd96f5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 2047.342372] env[69648]: DEBUG nova.compute.manager [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2047.366909] env[69648]: DEBUG nova.virt.hardware [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2047.367167] env[69648]: DEBUG nova.virt.hardware [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2047.367328] env[69648]: DEBUG nova.virt.hardware [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2047.367515] env[69648]: DEBUG nova.virt.hardware [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2047.367662] env[69648]: DEBUG nova.virt.hardware [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2047.367809] env[69648]: DEBUG nova.virt.hardware [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2047.368032] env[69648]: DEBUG nova.virt.hardware [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2047.368261] env[69648]: DEBUG nova.virt.hardware [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2047.368445] env[69648]: DEBUG 
nova.virt.hardware [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2047.368610] env[69648]: DEBUG nova.virt.hardware [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2047.368776] env[69648]: DEBUG nova.virt.hardware [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2047.369634] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d989e77-0c7c-4a47-95b4-56bb0aee918c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.377903] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bfb6df3-2c67-4b40-9580-c1054cb1e745 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.620977] env[69648]: DEBUG nova.network.neutron [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Successfully created port: d937e990-5e06-40ed-984e-39854b717b47 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2048.418930] env[69648]: DEBUG nova.network.neutron [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Successfully updated port: d937e990-5e06-40ed-984e-39854b717b47 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2048.432952] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Acquiring lock "refresh_cache-6f2b5030-4606-4873-a80b-186b841cc7dd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2048.433751] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Acquired lock "refresh_cache-6f2b5030-4606-4873-a80b-186b841cc7dd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2048.433751] env[69648]: DEBUG nova.network.neutron [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2048.479261] env[69648]: DEBUG nova.network.neutron [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 
tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2048.651610] env[69648]: DEBUG nova.network.neutron [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Updating instance_info_cache with network_info: [{"id": "d937e990-5e06-40ed-984e-39854b717b47", "address": "fa:16:3e:73:80:d6", "network": {"id": "b83c7471-3d67-499c-a310-5a62071f72af", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1533472057-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a8642682866147d6a4b04d5a31fd96f5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "84aee122-f630-43c5-9cc1-3a38d3819c82", "external-id": "nsx-vlan-transportzone-816", "segmentation_id": 816, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd937e990-5e", "ovs_interfaceid": "d937e990-5e06-40ed-984e-39854b717b47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2048.665976] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Releasing lock "refresh_cache-6f2b5030-4606-4873-a80b-186b841cc7dd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2048.666307] env[69648]: DEBUG nova.compute.manager [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Instance network_info: |[{"id": "d937e990-5e06-40ed-984e-39854b717b47", "address": "fa:16:3e:73:80:d6", "network": {"id": "b83c7471-3d67-499c-a310-5a62071f72af", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1533472057-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a8642682866147d6a4b04d5a31fd96f5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "84aee122-f630-43c5-9cc1-3a38d3819c82", "external-id": "nsx-vlan-transportzone-816", "segmentation_id": 816, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd937e990-5e", "ovs_interfaceid": "d937e990-5e06-40ed-984e-39854b717b47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2048.666705] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:73:80:d6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '84aee122-f630-43c5-9cc1-3a38d3819c82', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd937e990-5e06-40ed-984e-39854b717b47', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2048.674689] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Creating folder: Project (a8642682866147d6a4b04d5a31fd96f5). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2048.675307] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ea76bf70-506b-47a5-b4ac-3cbe9636166d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.686645] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Created folder: Project (a8642682866147d6a4b04d5a31fd96f5) in parent group-v692308. [ 2048.686836] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Creating folder: Instances. Parent ref: group-v692418. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2048.687095] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d0b5f343-9851-4730-8580-94d627db76e2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.696490] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Created folder: Instances in parent group-v692418. [ 2048.696730] env[69648]: DEBUG oslo.service.loopingcall [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2048.696922] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2048.697156] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b9a9fb7f-126b-453a-af93-7732f1c7118b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.716620] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2048.716620] env[69648]: value = "task-3466683" [ 2048.716620] env[69648]: _type = "Task" [ 2048.716620] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2048.724628] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466683, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2048.786039] env[69648]: DEBUG nova.compute.manager [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Received event network-vif-plugged-d937e990-5e06-40ed-984e-39854b717b47 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2048.786039] env[69648]: DEBUG oslo_concurrency.lockutils [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] Acquiring lock "6f2b5030-4606-4873-a80b-186b841cc7dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2048.786265] env[69648]: DEBUG oslo_concurrency.lockutils [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] Lock "6f2b5030-4606-4873-a80b-186b841cc7dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2048.786434] env[69648]: DEBUG oslo_concurrency.lockutils [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] Lock "6f2b5030-4606-4873-a80b-186b841cc7dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2048.786600] env[69648]: DEBUG nova.compute.manager [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] No waiting events found dispatching network-vif-plugged-d937e990-5e06-40ed-984e-39854b717b47 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2048.786765] env[69648]: WARNING nova.compute.manager [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Received unexpected event network-vif-plugged-d937e990-5e06-40ed-984e-39854b717b47 for instance with vm_state building and task_state spawning. 
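The network-vif-plugged warning above is the expected outcome of the external-event handshake when nothing is waiting: Neutron reports the port as plugged while the spawn is still in progress and no waiter has been registered for that event, so the compute manager pops nothing under the instance's "-events" lock and logs the WARNING instead of dispatching. A minimal sketch of that pop-or-warn dispatch; the class and method names here are hypothetical simplifications, not Nova's actual event-handling code.

    import threading
    from collections import defaultdict

    class InstanceEventTable:
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = defaultdict(dict)    # instance uuid -> {event name: threading.Event}

        def prepare(self, uuid, name):
            # Registered by the code path that expects the event (e.g. before plugging a VIF).
            with self._lock:
                evt = threading.Event()
                self._waiters[uuid][name] = evt
                return evt

        def pop(self, uuid, name):
            # Called when the external event arrives from the network service.
            with self._lock:
                return self._waiters.get(uuid, {}).pop(name, None)

    def handle_external_event(events, uuid, name):
        waiter = events.pop(uuid, name)
        if waiter is None:
            print(f"Received unexpected event {name} for instance {uuid}")   # the WARNING path
        else:
            waiter.set()                                                     # wake the waiting spawn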
[ 2048.786926] env[69648]: DEBUG nova.compute.manager [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Received event network-changed-d937e990-5e06-40ed-984e-39854b717b47 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2048.787090] env[69648]: DEBUG nova.compute.manager [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Refreshing instance network info cache due to event network-changed-d937e990-5e06-40ed-984e-39854b717b47. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 2048.787274] env[69648]: DEBUG oslo_concurrency.lockutils [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] Acquiring lock "refresh_cache-6f2b5030-4606-4873-a80b-186b841cc7dd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2048.787415] env[69648]: DEBUG oslo_concurrency.lockutils [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] Acquired lock "refresh_cache-6f2b5030-4606-4873-a80b-186b841cc7dd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2048.787575] env[69648]: DEBUG nova.network.neutron [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Refreshing network info cache for port d937e990-5e06-40ed-984e-39854b717b47 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2049.064570] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2049.064791] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2049.106477] env[69648]: DEBUG nova.network.neutron [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Updated VIF entry in instance network info cache for port d937e990-5e06-40ed-984e-39854b717b47. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2049.106831] env[69648]: DEBUG nova.network.neutron [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Updating instance_info_cache with network_info: [{"id": "d937e990-5e06-40ed-984e-39854b717b47", "address": "fa:16:3e:73:80:d6", "network": {"id": "b83c7471-3d67-499c-a310-5a62071f72af", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1533472057-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a8642682866147d6a4b04d5a31fd96f5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "84aee122-f630-43c5-9cc1-3a38d3819c82", "external-id": "nsx-vlan-transportzone-816", "segmentation_id": 816, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd937e990-5e", "ovs_interfaceid": "d937e990-5e06-40ed-984e-39854b717b47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2049.116519] env[69648]: DEBUG oslo_concurrency.lockutils [req-61455155-1afb-4fdd-89f0-d60ed26f08ac req-158beca1-6d17-4eba-aa0a-26559504fe73 service nova] Releasing lock "refresh_cache-6f2b5030-4606-4873-a80b-186b841cc7dd" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2049.230241] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466683, 'name': CreateVM_Task, 'duration_secs': 0.336362} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2049.230495] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2049.231505] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2049.231787] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2049.232277] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2049.232651] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b2cf704f-2161-44ce-aae1-5a83d5c3296a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.238722] env[69648]: DEBUG oslo_vmware.api [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Waiting for the task: (returnval){ [ 2049.238722] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5210ce2d-7680-07d2-ebea-712a8b7b6a78" [ 2049.238722] env[69648]: _type = "Task" [ 2049.238722] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2049.252989] env[69648]: DEBUG oslo_vmware.api [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5210ce2d-7680-07d2-ebea-712a8b7b6a78, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2049.749507] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2049.749905] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2049.750054] env[69648]: DEBUG oslo_concurrency.lockutils [None req-ea2bc78b-6c48-49c3-87d9-e2999783b1d7 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2050.065668] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2050.076992] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2050.077240] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2050.077414] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2050.077575] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2050.078780] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12bb994a-6c3a-49b0-81a0-6c897c91b360 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2050.087540] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d0c6a2e-2475-4435-a63b-e712d5e0e963 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2050.101571] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c83cadf-8aec-472e-b9c0-f917937c5f67 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2050.107609] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-393fa6ff-d683-4a7a-9007-dc6b002db5b9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2050.135696] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180966MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2050.135853] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2050.136064] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2050.207856] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2050.208041] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 723972b1-3f91-4c59-b265-3975644dadb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2050.208182] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 18745ec2-477d-427d-b2dd-997f73d9fd53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2050.208316] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2050.208434] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2050.208554] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2050.208672] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2050.208787] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cb6b7f04-1c44-4998-bd28-8a01c4b235e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2050.208904] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cc77a95f-ea00-4b01-96ac-8256672eeb39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2050.209027] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6f2b5030-4606-4873-a80b-186b841cc7dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2050.209218] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2050.209359] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2050.321280] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60c70eec-c2f1-47be-80ba-4d77e63f3484 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2050.328855] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7e1d646-c771-4f1a-9639-2ba30838ef9d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2050.360187] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddc18e0d-4bbf-43ce-8580-a0d49cbccd9e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2050.367116] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf283ee3-c7d7-41fc-af77-9b21ce214138 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2050.380316] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2050.388816] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2050.401959] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2050.402176] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.266s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2052.402637] env[69648]: DEBUG oslo_service.periodic_task [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2053.065107] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2053.065310] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances with incomplete migration {{(pid=69648) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}} [ 2055.073452] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2055.073811] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 2055.073811] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2055.093233] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2055.093406] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2055.093506] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2055.093636] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2055.093762] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2055.093886] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2055.094018] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2055.094146] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2055.094266] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2055.094385] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2055.094511] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2056.065372] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2056.065572] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}} [ 2056.080230] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] There are 1 instances to clean {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}} [ 2056.080575] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 31c84a7e-7a41-4d9f-ad29-6dad6648d85f] Instance has had 0 of 5 cleanup attempts {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11217}} [ 2065.066971] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2094.224889] env[69648]: WARNING oslo_vmware.rw_handles [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2094.224889] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2094.224889] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2094.224889] env[69648]: 
ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2094.224889] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2094.224889] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 2094.224889] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2094.224889] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2094.224889] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2094.224889] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2094.224889] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2094.224889] env[69648]: ERROR oslo_vmware.rw_handles [ 2094.225473] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/30acdc52-49cb-4dd0-9b60-c07aea1aa6a0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2094.227263] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2094.227522] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Copying Virtual Disk [datastore1] vmware_temp/30acdc52-49cb-4dd0-9b60-c07aea1aa6a0/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/30acdc52-49cb-4dd0-9b60-c07aea1aa6a0/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2094.227792] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4c321042-1744-40e5-add9-b62db8dde762 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.235879] env[69648]: DEBUG oslo_vmware.api [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for the task: (returnval){ [ 2094.235879] env[69648]: value = "task-3466684" [ 2094.235879] env[69648]: _type = "Task" [ 2094.235879] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2094.243546] env[69648]: DEBUG oslo_vmware.api [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Task: {'id': task-3466684, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2094.747114] env[69648]: DEBUG oslo_vmware.exceptions [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2094.747114] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2094.747382] env[69648]: ERROR nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2094.747382] env[69648]: Faults: ['InvalidArgument'] [ 2094.747382] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Traceback (most recent call last): [ 2094.747382] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2094.747382] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] yield resources [ 2094.747382] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2094.747382] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] self.driver.spawn(context, instance, image_meta, [ 2094.747382] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2094.747382] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2094.747382] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2094.747382] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] self._fetch_image_if_missing(context, vi) [ 2094.747382] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] image_cache(vi, tmp_image_ds_loc) [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] vm_util.copy_virtual_disk( [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] session._wait_for_task(vmdk_copy_task) [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] return self.wait_for_task(task_ref) [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] return evt.wait() [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] result = hub.switch() [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2094.747754] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] return self.greenlet.switch() [ 2094.748059] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2094.748059] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] self.f(*self.args, **self.kw) [ 2094.748059] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2094.748059] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] raise exceptions.translate_fault(task_info.error) [ 2094.748059] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2094.748059] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Faults: ['InvalidArgument'] [ 2094.748059] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] [ 2094.748059] env[69648]: INFO nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Terminating instance [ 2094.749255] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2094.749469] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] 
Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2094.749706] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-47fd2edf-e497-4d4b-a868-c774a2166bde {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.752060] env[69648]: DEBUG nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2094.752287] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2094.752983] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b884e2d-30d9-463e-acc6-bb19a68a8286 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.759534] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2094.759730] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9c7ce7d6-dda1-4146-9b22-2435eb2c7cad {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.761918] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2094.762125] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2094.763043] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-54a33767-0d5a-46e3-8b82-bc7cc95ee8a8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.767620] env[69648]: DEBUG oslo_vmware.api [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 2094.767620] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52f2ccd5-dc54-5d7a-808c-cda0dca42ad5" [ 2094.767620] env[69648]: _type = "Task" [ 2094.767620] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2094.782193] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2094.782447] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Creating directory with path [datastore1] vmware_temp/b5f70ee2-bce2-4fe1-aec2-2cef8ebe3fbe/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2094.783082] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-34e67b77-71b0-4da1-9483-7f55506ad468 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.795196] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Created directory with path [datastore1] vmware_temp/b5f70ee2-bce2-4fe1-aec2-2cef8ebe3fbe/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2094.796039] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Fetch image to [datastore1] vmware_temp/b5f70ee2-bce2-4fe1-aec2-2cef8ebe3fbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2094.796039] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/b5f70ee2-bce2-4fe1-aec2-2cef8ebe3fbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2094.796300] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56dccb95-794f-4f05-b48b-5e59dbf05e4b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.803147] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62770037-baca-460e-985a-1b7c23716146 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.812098] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-160a6a86-2276-442a-83b3-33b86bbbd495 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.844617] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ac0d028-409e-4388-a63b-3c4495f0bb70 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.847123] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2094.847343] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2094.847499] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Deleting the datastore file [datastore1] bde8a72e-0ed5-4794-badf-0bc54c4c408b {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2094.847721] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-db0728f3-057f-46d0-aabd-3543ed552962 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.853044] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f9345e56-8818-4dcd-925a-db7d00d6bbb9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2094.855286] env[69648]: DEBUG oslo_vmware.api [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for the task: (returnval){ [ 2094.855286] env[69648]: value = "task-3466686" [ 2094.855286] env[69648]: _type = "Task" [ 2094.855286] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2094.862752] env[69648]: DEBUG oslo_vmware.api [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Task: {'id': task-3466686, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2094.873914] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2095.010286] env[69648]: DEBUG oslo_vmware.rw_handles [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b5f70ee2-bce2-4fe1-aec2-2cef8ebe3fbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2095.071902] env[69648]: DEBUG oslo_vmware.rw_handles [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2095.072144] env[69648]: DEBUG oslo_vmware.rw_handles [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b5f70ee2-bce2-4fe1-aec2-2cef8ebe3fbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2095.365992] env[69648]: DEBUG oslo_vmware.api [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Task: {'id': task-3466686, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07308} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2095.366323] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2095.366457] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2095.366630] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2095.366802] env[69648]: INFO nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 2095.368815] env[69648]: DEBUG nova.compute.claims [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2095.368995] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2095.369226] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2095.529721] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9797e57b-ed13-4112-80b1-2bf28ce44d13 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.536937] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8551c756-7f9e-4dff-bd85-4d3cccb31e99 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.566250] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bafb417-6837-4b59-86b0-423bc23bfe09 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.572869] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8e434ce-6e6c-4f16-b669-e115a7de0429 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.585414] env[69648]: DEBUG nova.compute.provider_tree [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2095.593674] env[69648]: DEBUG nova.scheduler.client.report [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2095.606931] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 
tempest-DeleteServersTestJSON-579077004-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.238s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2095.607463] env[69648]: ERROR nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2095.607463] env[69648]: Faults: ['InvalidArgument'] [ 2095.607463] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Traceback (most recent call last): [ 2095.607463] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2095.607463] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] self.driver.spawn(context, instance, image_meta, [ 2095.607463] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2095.607463] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2095.607463] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2095.607463] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] self._fetch_image_if_missing(context, vi) [ 2095.607463] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2095.607463] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] image_cache(vi, tmp_image_ds_loc) [ 2095.607463] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] vm_util.copy_virtual_disk( [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] session._wait_for_task(vmdk_copy_task) [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] return self.wait_for_task(task_ref) [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] return evt.wait() [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: 
bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] result = hub.switch() [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] return self.greenlet.switch() [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2095.607751] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] self.f(*self.args, **self.kw) [ 2095.608056] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2095.608056] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] raise exceptions.translate_fault(task_info.error) [ 2095.608056] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2095.608056] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Faults: ['InvalidArgument'] [ 2095.608056] env[69648]: ERROR nova.compute.manager [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] [ 2095.608197] env[69648]: DEBUG nova.compute.utils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2095.609511] env[69648]: DEBUG nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Build of instance bde8a72e-0ed5-4794-badf-0bc54c4c408b was re-scheduled: A specified parameter was not correct: fileType [ 2095.609511] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2095.609882] env[69648]: DEBUG nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2095.610069] env[69648]: DEBUG nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2095.610276] env[69648]: DEBUG nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2095.610449] env[69648]: DEBUG nova.network.neutron [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2095.914016] env[69648]: DEBUG nova.network.neutron [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2095.931418] env[69648]: INFO nova.compute.manager [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Took 0.32 seconds to deallocate network for instance. [ 2096.040342] env[69648]: INFO nova.scheduler.client.report [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Deleted allocations for instance bde8a72e-0ed5-4794-badf-0bc54c4c408b [ 2096.066075] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dc6069aa-918b-4189-b661-5782c6e803bd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 672.453s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2096.066075] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 475.900s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2096.066075] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2096.066332] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2096.066332] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2096.067671] env[69648]: INFO nova.compute.manager [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Terminating instance [ 2096.069813] env[69648]: DEBUG nova.compute.manager [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2096.070029] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2096.070524] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7fb0ae3f-4d0c-4fd5-a62f-d342a89548b5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2096.081289] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca697f8b-d595-448d-9dec-bc8ad4160939 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2096.111943] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bde8a72e-0ed5-4794-badf-0bc54c4c408b could not be found. [ 2096.112354] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2096.112524] env[69648]: INFO nova.compute.manager [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2096.112854] env[69648]: DEBUG oslo.service.loopingcall [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2096.113145] env[69648]: DEBUG nova.compute.manager [-] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2096.113260] env[69648]: DEBUG nova.network.neutron [-] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2096.149372] env[69648]: DEBUG nova.network.neutron [-] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2096.157326] env[69648]: INFO nova.compute.manager [-] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] Took 0.04 seconds to deallocate network for instance. [ 2096.251530] env[69648]: DEBUG oslo_concurrency.lockutils [None req-4515364b-f105-474f-b741-305f77aabcdd tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.187s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2096.252805] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 301.236s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2096.252805] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: bde8a72e-0ed5-4794-badf-0bc54c4c408b] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2096.252805] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "bde8a72e-0ed5-4794-badf-0bc54c4c408b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2101.076209] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2103.065638] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2103.065966] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2104.060851] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2104.080736] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2105.079744] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2110.066113] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2110.066469] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2112.065326] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2112.077543] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2112.077784] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2112.077955] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2112.078137] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2112.079337] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e236901b-f0d9-47cc-b9c0-3163e15dee4a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.089206] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f35828a-e1d0-4146-9cd5-90086c7cfd46 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.102962] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8ffed7c-44d0-4f6f-b594-ce4896742f02 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.108973] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4106d1c7-8c8b-404f-a4f9-db27408f89c9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.137679] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180955MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2112.137788] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
2112.137980] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2112.203843] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 723972b1-3f91-4c59-b265-3975644dadb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2112.204015] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 18745ec2-477d-427d-b2dd-997f73d9fd53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2112.204156] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2112.204284] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2112.204406] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2112.204524] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2112.204640] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cb6b7f04-1c44-4998-bd28-8a01c4b235e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2112.204756] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cc77a95f-ea00-4b01-96ac-8256672eeb39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2112.204871] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6f2b5030-4606-4873-a80b-186b841cc7dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2112.205065] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2112.205209] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2112.309872] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-166b60f8-ce69-4477-98f8-c7681668cb5c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.318380] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94e3a699-adc5-4ca2-9347-806b0c7f3294 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.349418] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-741f4ebf-b14f-4705-b410-28864442c9e6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.356753] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b192c34d-47da-4471-9440-d7ed77008e65 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.370145] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2112.378738] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2112.394875] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2112.395084] 
env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.257s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2114.396061] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2115.065916] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2115.066127] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 2115.066255] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2115.085940] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2115.086107] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2115.086246] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2115.086379] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2115.086504] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2115.086671] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2115.086757] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2115.086879] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2115.087000] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2115.087139] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2141.944894] env[69648]: WARNING oslo_vmware.rw_handles [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2141.944894] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2141.944894] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2141.944894] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2141.944894] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2141.944894] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 2141.944894] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2141.944894] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2141.944894] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2141.944894] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2141.944894] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2141.944894] env[69648]: ERROR oslo_vmware.rw_handles [ 2141.945470] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/b5f70ee2-bce2-4fe1-aec2-2cef8ebe3fbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2141.947400] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2141.947660] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Copying Virtual Disk 
[datastore1] vmware_temp/b5f70ee2-bce2-4fe1-aec2-2cef8ebe3fbe/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/b5f70ee2-bce2-4fe1-aec2-2cef8ebe3fbe/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2141.947966] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-caae1b04-f5ab-4442-9ce2-768e66dc5b94 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.956868] env[69648]: DEBUG oslo_vmware.api [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 2141.956868] env[69648]: value = "task-3466687" [ 2141.956868] env[69648]: _type = "Task" [ 2141.956868] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2141.964285] env[69648]: DEBUG oslo_vmware.api [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': task-3466687, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2142.467814] env[69648]: DEBUG oslo_vmware.exceptions [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2142.468115] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2142.468702] env[69648]: ERROR nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2142.468702] env[69648]: Faults: ['InvalidArgument'] [ 2142.468702] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Traceback (most recent call last): [ 2142.468702] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2142.468702] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] yield resources [ 2142.468702] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2142.468702] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] self.driver.spawn(context, instance, image_meta, [ 2142.468702] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 
2142.468702] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2142.468702] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2142.468702] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] self._fetch_image_if_missing(context, vi) [ 2142.468702] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2142.468702] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] image_cache(vi, tmp_image_ds_loc) [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] vm_util.copy_virtual_disk( [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] session._wait_for_task(vmdk_copy_task) [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] return self.wait_for_task(task_ref) [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] return evt.wait() [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] result = hub.switch() [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] return self.greenlet.switch() [ 2142.469060] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2142.469385] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] self.f(*self.args, **self.kw) [ 2142.469385] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2142.469385] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] raise exceptions.translate_fault(task_info.error) [ 2142.469385] env[69648]: ERROR nova.compute.manager [instance: 
723972b1-3f91-4c59-b265-3975644dadb2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2142.469385] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Faults: ['InvalidArgument'] [ 2142.469385] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] [ 2142.469385] env[69648]: INFO nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Terminating instance [ 2142.470537] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2142.470750] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2142.470987] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9cc85d24-cc90-4982-bbb5-405cec56d933 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2142.473275] env[69648]: DEBUG nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2142.473473] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2142.474257] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1f679f5-b748-4dd9-b664-2b39dcf13e9a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2142.481020] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2142.481241] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c8d6efc6-abc2-4cc9-a295-c9e106763cab {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2142.483328] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2142.483491] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2142.484431] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bd347a1b-7c88-438d-8d56-aa8afe510ec5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2142.489352] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: (returnval){ [ 2142.489352] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]526c4729-5755-c830-3bbb-38361d925bf7" [ 2142.489352] env[69648]: _type = "Task" [ 2142.489352] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2142.496380] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]526c4729-5755-c830-3bbb-38361d925bf7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2142.547500] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2142.547716] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2142.547872] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Deleting the datastore file [datastore1] 723972b1-3f91-4c59-b265-3975644dadb2 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2142.548149] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6d4cb37b-7f24-4ede-a1b2-d9fbea8e08ef {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2142.554126] env[69648]: DEBUG oslo_vmware.api [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for the task: (returnval){ [ 2142.554126] env[69648]: value = "task-3466689" [ 2142.554126] env[69648]: _type = "Task" [ 2142.554126] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2142.562379] env[69648]: DEBUG oslo_vmware.api [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': task-3466689, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2142.999310] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2142.999643] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Creating directory with path [datastore1] vmware_temp/0278433c-efa4-42a1-9b22-af6f8c205f9a/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2142.999833] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-03848c6e-b144-4726-92e8-4715725e3b3b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.013691] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Created directory with path [datastore1] vmware_temp/0278433c-efa4-42a1-9b22-af6f8c205f9a/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2143.013889] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Fetch image to [datastore1] vmware_temp/0278433c-efa4-42a1-9b22-af6f8c205f9a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2143.014076] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/0278433c-efa4-42a1-9b22-af6f8c205f9a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2143.014823] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37d809eb-1278-4b35-b8d0-ac0c563fdab2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.021696] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34569da0-7001-4348-aa52-5f8430bc3ad3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.030419] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d42f7d0-b60a-4561-9cb5-b104c1610d77 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.063637] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34291bd4-d05b-4651-a838-6df71ae028ff 
{{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.070358] env[69648]: DEBUG oslo_vmware.api [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Task: {'id': task-3466689, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.087302} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2143.071809] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2143.072007] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2143.072239] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2143.072358] env[69648]: INFO nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2143.074156] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-29e2a6fd-a41f-4af8-8930-28f264628d2c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.075991] env[69648]: DEBUG nova.compute.claims [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2143.076171] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2143.076385] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2143.099217] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2143.148550] env[69648]: DEBUG oslo_vmware.rw_handles [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0278433c-efa4-42a1-9b22-af6f8c205f9a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2143.207960] env[69648]: DEBUG oslo_vmware.rw_handles [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2143.207960] env[69648]: DEBUG oslo_vmware.rw_handles [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0278433c-efa4-42a1-9b22-af6f8c205f9a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2143.279819] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd21958d-a548-4841-941c-a759bf21882e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.287799] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d285872-9ee0-4d26-853c-74cb9447c770 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.318986] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-885a5102-a108-42bc-bb3f-0263627aac30 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.325678] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92b94ea0-ad4e-4ce7-a4ac-b2d76913516a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.338590] env[69648]: DEBUG nova.compute.provider_tree [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2143.346853] env[69648]: DEBUG nova.scheduler.client.report [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2143.359516] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.283s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2143.360073] env[69648]: ERROR nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2143.360073] env[69648]: Faults: ['InvalidArgument'] [ 2143.360073] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Traceback (most recent call last): [ 2143.360073] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2143.360073] env[69648]: ERROR nova.compute.manager [instance: 
723972b1-3f91-4c59-b265-3975644dadb2] self.driver.spawn(context, instance, image_meta, [ 2143.360073] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2143.360073] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2143.360073] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2143.360073] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] self._fetch_image_if_missing(context, vi) [ 2143.360073] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2143.360073] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] image_cache(vi, tmp_image_ds_loc) [ 2143.360073] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] vm_util.copy_virtual_disk( [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] session._wait_for_task(vmdk_copy_task) [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] return self.wait_for_task(task_ref) [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] return evt.wait() [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] result = hub.switch() [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] return self.greenlet.switch() [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2143.360414] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] self.f(*self.args, **self.kw) [ 2143.360765] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2143.360765] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] raise exceptions.translate_fault(task_info.error) [ 2143.360765] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2143.360765] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Faults: ['InvalidArgument'] [ 2143.360765] env[69648]: ERROR nova.compute.manager [instance: 723972b1-3f91-4c59-b265-3975644dadb2] [ 2143.360765] env[69648]: DEBUG nova.compute.utils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2143.362048] env[69648]: DEBUG nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Build of instance 723972b1-3f91-4c59-b265-3975644dadb2 was re-scheduled: A specified parameter was not correct: fileType [ 2143.362048] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2143.362414] env[69648]: DEBUG nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2143.362609] env[69648]: DEBUG nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2143.362803] env[69648]: DEBUG nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2143.362966] env[69648]: DEBUG nova.network.neutron [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2143.901833] env[69648]: DEBUG nova.network.neutron [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2143.914321] env[69648]: INFO nova.compute.manager [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Took 0.55 seconds to deallocate network for instance. [ 2144.009727] env[69648]: INFO nova.scheduler.client.report [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Deleted allocations for instance 723972b1-3f91-4c59-b265-3975644dadb2 [ 2144.034240] env[69648]: DEBUG oslo_concurrency.lockutils [None req-74c9ab25-9d58-4183-95df-9a7b9e3e4f7d tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "723972b1-3f91-4c59-b265-3975644dadb2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 560.769s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2144.034240] env[69648]: DEBUG oslo_concurrency.lockutils [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "723972b1-3f91-4c59-b265-3975644dadb2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 365.743s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2144.034240] env[69648]: DEBUG oslo_concurrency.lockutils [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Acquiring lock "723972b1-3f91-4c59-b265-3975644dadb2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2144.034240] env[69648]: DEBUG oslo_concurrency.lockutils [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "723972b1-3f91-4c59-b265-3975644dadb2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2144.034421] env[69648]: 
DEBUG oslo_concurrency.lockutils [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "723972b1-3f91-4c59-b265-3975644dadb2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2144.036601] env[69648]: INFO nova.compute.manager [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Terminating instance [ 2144.038839] env[69648]: DEBUG nova.compute.manager [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2144.038839] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2144.039377] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b6b1dcb3-21f2-4497-af6e-42127912e9e4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2144.049411] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4de4c435-3a4d-450b-ae6e-476b9077d4d1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2144.078185] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 723972b1-3f91-4c59-b265-3975644dadb2 could not be found. [ 2144.078185] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2144.078185] env[69648]: INFO nova.compute.manager [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2144.078185] env[69648]: DEBUG oslo.service.loopingcall [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2144.078469] env[69648]: DEBUG nova.compute.manager [-] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2144.078717] env[69648]: DEBUG nova.network.neutron [-] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2144.134060] env[69648]: DEBUG nova.network.neutron [-] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2144.142501] env[69648]: INFO nova.compute.manager [-] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] Took 0.06 seconds to deallocate network for instance. [ 2144.232372] env[69648]: DEBUG oslo_concurrency.lockutils [None req-5bc67915-d807-4c2b-b568-c1752427e69b tempest-ImagesTestJSON-1163555157 tempest-ImagesTestJSON-1163555157-project-member] Lock "723972b1-3f91-4c59-b265-3975644dadb2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2144.234333] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "723972b1-3f91-4c59-b265-3975644dadb2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 349.217s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2144.234333] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 723972b1-3f91-4c59-b265-3975644dadb2] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2144.234333] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "723972b1-3f91-4c59-b265-3975644dadb2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2163.065590] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2164.065687] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2165.060463] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2165.065170] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2165.065420] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2172.064641] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2172.065038] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2172.065038] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2172.076554] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2172.076765] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2172.077300] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2172.077300] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2172.078198] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01f668b3-30ab-4550-b6a2-48fbcce0d38e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.087040] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-057cbe4a-c8f2-42e1-be47-54a8841b9dc2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.100826] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d9b7c1f-7150-49a2-9963-e2cf4f532205 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.107098] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d427ce7e-5ca2-48e5-a556-e3443cc3ba0c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.135375] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180965MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2172.135516] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
2172.135700] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2172.198984] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 18745ec2-477d-427d-b2dd-997f73d9fd53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2172.199203] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2172.199342] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2172.199469] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2172.199591] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2172.199711] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cb6b7f04-1c44-4998-bd28-8a01c4b235e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2172.199831] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cc77a95f-ea00-4b01-96ac-8256672eeb39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2172.199947] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6f2b5030-4606-4873-a80b-186b841cc7dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2172.200152] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2172.200295] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2172.290081] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36dc8821-005b-45ed-90a7-a17a6a4258f4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.298828] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ab9022a-5b86-4603-a4cc-d50d4ccf89b9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.328663] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e472c581-d111-4240-91b9-6e3c595a2255 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.335558] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b925b19e-4c53-407f-a65f-51599e4f7fac {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.348886] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2172.357078] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2172.371093] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2172.371284] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.236s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2175.372461] env[69648]: DEBUG oslo_service.periodic_task [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2177.065914] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2177.066224] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 2177.066277] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2177.087427] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2177.087585] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2177.087905] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2177.088114] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2177.088255] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2177.088384] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2177.088510] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2177.088634] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2177.088759] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2180.811537] env[69648]: DEBUG oslo_concurrency.lockutils [None req-5330a82c-3e7a-4145-bea7-3ca34a320d89 tempest-ServersNegativeTestJSON-381661346 tempest-ServersNegativeTestJSON-381661346-project-member] Acquiring lock "6f2b5030-4606-4873-a80b-186b841cc7dd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2187.093483] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "349fb4bd-6187-4914-8322-082865bc5562" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2187.094206] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "349fb4bd-6187-4914-8322-082865bc5562" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2187.111857] env[69648]: DEBUG nova.compute.manager [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2187.182850] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2187.183245] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2187.184807] env[69648]: INFO nova.compute.claims [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2187.387193] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e9cc084-cf2b-4a59-bd69-64dc356f2fe4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.394701] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-115a6055-3aa1-4fed-91d5-eac94ea49208 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.428887] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-653225ed-52c8-4fa3-b567-13c9a29d22bd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.439287] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aab3561e-fe9d-4e2b-9558-e7c7441c6762 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.454935] env[69648]: DEBUG nova.compute.provider_tree [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2187.468209] env[69648]: DEBUG nova.scheduler.client.report [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2187.485067] env[69648]: DEBUG 
oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2187.485574] env[69648]: DEBUG nova.compute.manager [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2187.521746] env[69648]: DEBUG nova.compute.utils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2187.523343] env[69648]: DEBUG nova.compute.manager [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2187.523534] env[69648]: DEBUG nova.network.neutron [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2187.535185] env[69648]: DEBUG nova.compute.manager [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2187.585613] env[69648]: DEBUG nova.policy [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'caf89555b5df4f5fa4cac41f6b1792db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca41677808a749f1b88e43a112db7fb2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 2187.607526] env[69648]: DEBUG nova.compute.manager [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Start spawning the instance on the hypervisor. 
{{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2187.629181] env[69648]: DEBUG nova.virt.hardware [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2187.629457] env[69648]: DEBUG nova.virt.hardware [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2187.629618] env[69648]: DEBUG nova.virt.hardware [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2187.629799] env[69648]: DEBUG nova.virt.hardware [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2187.629949] env[69648]: DEBUG nova.virt.hardware [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2187.630481] env[69648]: DEBUG nova.virt.hardware [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2187.630481] env[69648]: DEBUG nova.virt.hardware [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2187.630683] env[69648]: DEBUG nova.virt.hardware [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2187.630723] 
env[69648]: DEBUG nova.virt.hardware [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2187.630901] env[69648]: DEBUG nova.virt.hardware [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2187.631032] env[69648]: DEBUG nova.virt.hardware [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2187.632223] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3000a590-1524-4096-a0d4-a0f63b8dfa2c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.640645] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0cfa2f4-0ed1-4c50-be2a-1d20c7daac02 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.315826] env[69648]: DEBUG nova.network.neutron [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Successfully created port: 3273df74-77d0-4a83-acfc-cc91d6366c20 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2188.587613] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Acquiring lock "70632638-9c26-4c7b-a01e-9fa13edd409a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2188.587866] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Lock "70632638-9c26-4c7b-a01e-9fa13edd409a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2188.598616] env[69648]: DEBUG nova.compute.manager [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Starting instance... 
{{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2188.655489] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2188.655755] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2188.657505] env[69648]: INFO nova.compute.claims [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2188.848883] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4da1809b-ed2a-4e4b-ac60-52e18aaa5056 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.856761] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74bcddc0-d484-4c0c-8082-b5d62d551b31 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.888083] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ac3570e-d33d-4132-bba4-f813d4cc4aae {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.895892] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e958e586-d1cb-4c3f-ae0b-5067cfe70af9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.910859] env[69648]: DEBUG nova.compute.provider_tree [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2188.913060] env[69648]: DEBUG nova.compute.manager [req-3343b13c-fd91-4bd8-9e6e-f5617d294aa1 req-b3bfe42e-87bc-4a04-be0a-66bb38b1175b service nova] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Received event network-vif-plugged-3273df74-77d0-4a83-acfc-cc91d6366c20 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2188.913333] env[69648]: DEBUG oslo_concurrency.lockutils [req-3343b13c-fd91-4bd8-9e6e-f5617d294aa1 req-b3bfe42e-87bc-4a04-be0a-66bb38b1175b service nova] Acquiring lock "349fb4bd-6187-4914-8322-082865bc5562-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2188.913579] env[69648]: DEBUG oslo_concurrency.lockutils 
[req-3343b13c-fd91-4bd8-9e6e-f5617d294aa1 req-b3bfe42e-87bc-4a04-be0a-66bb38b1175b service nova] Lock "349fb4bd-6187-4914-8322-082865bc5562-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2188.913753] env[69648]: DEBUG oslo_concurrency.lockutils [req-3343b13c-fd91-4bd8-9e6e-f5617d294aa1 req-b3bfe42e-87bc-4a04-be0a-66bb38b1175b service nova] Lock "349fb4bd-6187-4914-8322-082865bc5562-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2188.913919] env[69648]: DEBUG nova.compute.manager [req-3343b13c-fd91-4bd8-9e6e-f5617d294aa1 req-b3bfe42e-87bc-4a04-be0a-66bb38b1175b service nova] [instance: 349fb4bd-6187-4914-8322-082865bc5562] No waiting events found dispatching network-vif-plugged-3273df74-77d0-4a83-acfc-cc91d6366c20 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2188.914199] env[69648]: WARNING nova.compute.manager [req-3343b13c-fd91-4bd8-9e6e-f5617d294aa1 req-b3bfe42e-87bc-4a04-be0a-66bb38b1175b service nova] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Received unexpected event network-vif-plugged-3273df74-77d0-4a83-acfc-cc91d6366c20 for instance with vm_state building and task_state spawning. [ 2188.923661] env[69648]: DEBUG nova.scheduler.client.report [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2188.942883] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2188.943423] env[69648]: DEBUG nova.compute.manager [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Start building networks asynchronously for instance. 
{{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2188.989984] env[69648]: DEBUG nova.compute.utils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2188.991401] env[69648]: DEBUG nova.compute.manager [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Allocating IP information in the background. {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2188.991569] env[69648]: DEBUG nova.network.neutron [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2189.002711] env[69648]: DEBUG nova.network.neutron [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Successfully updated port: 3273df74-77d0-4a83-acfc-cc91d6366c20 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2189.005433] env[69648]: DEBUG nova.compute.manager [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2189.017280] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "refresh_cache-349fb4bd-6187-4914-8322-082865bc5562" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2189.017417] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquired lock "refresh_cache-349fb4bd-6187-4914-8322-082865bc5562" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2189.017569] env[69648]: DEBUG nova.network.neutron [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2189.059751] env[69648]: DEBUG nova.network.neutron [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2189.079559] env[69648]: DEBUG nova.compute.manager [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2189.091104] env[69648]: DEBUG nova.policy [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aeb00590ba994740b110b61bd07ee98a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f13aeffcea10467db8c0130518931a38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 2189.107451] env[69648]: DEBUG nova.virt.hardware [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2189.107451] env[69648]: DEBUG nova.virt.hardware [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2189.107623] env[69648]: DEBUG nova.virt.hardware [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Image limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2189.107737] env[69648]: DEBUG nova.virt.hardware [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2189.107877] env[69648]: DEBUG nova.virt.hardware [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2189.108034] 
env[69648]: DEBUG nova.virt.hardware [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2189.108327] env[69648]: DEBUG nova.virt.hardware [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2189.108408] env[69648]: DEBUG nova.virt.hardware [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2189.108645] env[69648]: DEBUG nova.virt.hardware [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2189.108836] env[69648]: DEBUG nova.virt.hardware [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2189.109021] env[69648]: DEBUG nova.virt.hardware [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2189.109864] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf00690c-c997-4431-9f07-1ad5a30f0d32 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2189.120636] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2374fa3-4d3d-4835-9c0f-f61e6ab30eb9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2189.237765] env[69648]: DEBUG nova.network.neutron [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Updating instance_info_cache with network_info: [{"id": "3273df74-77d0-4a83-acfc-cc91d6366c20", "address": "fa:16:3e:92:a6:df", "network": {"id": "c571b585-f722-4160-a8c7-ee2a6eb60153", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-12468984-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, 
"tenant_id": "ca41677808a749f1b88e43a112db7fb2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3273df74-77", "ovs_interfaceid": "3273df74-77d0-4a83-acfc-cc91d6366c20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2189.259070] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Releasing lock "refresh_cache-349fb4bd-6187-4914-8322-082865bc5562" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2189.259070] env[69648]: DEBUG nova.compute.manager [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Instance network_info: |[{"id": "3273df74-77d0-4a83-acfc-cc91d6366c20", "address": "fa:16:3e:92:a6:df", "network": {"id": "c571b585-f722-4160-a8c7-ee2a6eb60153", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-12468984-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ca41677808a749f1b88e43a112db7fb2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3273df74-77", "ovs_interfaceid": "3273df74-77d0-4a83-acfc-cc91d6366c20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2189.259543] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:92:a6:df', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1a55f45a-d631-4ebc-b73b-8a30bd0a32a8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3273df74-77d0-4a83-acfc-cc91d6366c20', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2189.265484] env[69648]: DEBUG oslo.service.loopingcall [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2189.266163] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2189.266524] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-79cf3dc8-dec2-4a02-89d9-1f3993be68ad {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2189.289132] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2189.289132] env[69648]: value = "task-3466690" [ 2189.289132] env[69648]: _type = "Task" [ 2189.289132] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2189.297717] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466690, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2189.423231] env[69648]: DEBUG nova.network.neutron [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Successfully created port: 29dbee66-3044-4f1c-af2a-4a198595fd4a {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2189.799613] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466690, 'name': CreateVM_Task, 'duration_secs': 0.323521} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2189.799814] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2189.800529] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2189.800703] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2189.801087] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2189.801294] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7b193b44-43e4-4d1a-add8-ff2866773783 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2189.805746] env[69648]: DEBUG 
oslo_vmware.api [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for the task: (returnval){ [ 2189.805746] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5279af29-ba83-a09d-e633-824f63bf03cb" [ 2189.805746] env[69648]: _type = "Task" [ 2189.805746] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2189.813141] env[69648]: DEBUG oslo_vmware.api [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5279af29-ba83-a09d-e633-824f63bf03cb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2190.300542] env[69648]: DEBUG nova.compute.manager [req-c6abf16c-c573-4cb1-a016-16a5cc08f562 req-f878dbef-5e0a-489e-99de-99df3ec99f8c service nova] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Received event network-vif-plugged-29dbee66-3044-4f1c-af2a-4a198595fd4a {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2190.300736] env[69648]: DEBUG oslo_concurrency.lockutils [req-c6abf16c-c573-4cb1-a016-16a5cc08f562 req-f878dbef-5e0a-489e-99de-99df3ec99f8c service nova] Acquiring lock "70632638-9c26-4c7b-a01e-9fa13edd409a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2190.300955] env[69648]: DEBUG oslo_concurrency.lockutils [req-c6abf16c-c573-4cb1-a016-16a5cc08f562 req-f878dbef-5e0a-489e-99de-99df3ec99f8c service nova] Lock "70632638-9c26-4c7b-a01e-9fa13edd409a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2190.301519] env[69648]: DEBUG oslo_concurrency.lockutils [req-c6abf16c-c573-4cb1-a016-16a5cc08f562 req-f878dbef-5e0a-489e-99de-99df3ec99f8c service nova] Lock "70632638-9c26-4c7b-a01e-9fa13edd409a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2190.301779] env[69648]: DEBUG nova.compute.manager [req-c6abf16c-c573-4cb1-a016-16a5cc08f562 req-f878dbef-5e0a-489e-99de-99df3ec99f8c service nova] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] No waiting events found dispatching network-vif-plugged-29dbee66-3044-4f1c-af2a-4a198595fd4a {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2190.302132] env[69648]: WARNING nova.compute.manager [req-c6abf16c-c573-4cb1-a016-16a5cc08f562 req-f878dbef-5e0a-489e-99de-99df3ec99f8c service nova] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Received unexpected event network-vif-plugged-29dbee66-3044-4f1c-af2a-4a198595fd4a for instance with vm_state building and task_state spawning. 
[ 2190.316075] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2190.316379] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2190.316633] env[69648]: DEBUG oslo_concurrency.lockutils [None req-c9ec102d-4944-4112-ad9a-bb769badb7b8 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2190.350472] env[69648]: DEBUG nova.network.neutron [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Successfully updated port: 29dbee66-3044-4f1c-af2a-4a198595fd4a {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2190.361930] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Acquiring lock "refresh_cache-70632638-9c26-4c7b-a01e-9fa13edd409a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2190.362079] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Acquired lock "refresh_cache-70632638-9c26-4c7b-a01e-9fa13edd409a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2190.362362] env[69648]: DEBUG nova.network.neutron [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2190.402245] env[69648]: DEBUG nova.network.neutron [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2190.575781] env[69648]: DEBUG nova.network.neutron [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Updating instance_info_cache with network_info: [{"id": "29dbee66-3044-4f1c-af2a-4a198595fd4a", "address": "fa:16:3e:9a:f3:0d", "network": {"id": "50b7ce8f-9175-4b67-8364-26cdfc36bcb9", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1610618242-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f13aeffcea10467db8c0130518931a38", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd77ecbc-aaaf-45f4-ae8f-977d90e4052f", "external-id": "nsx-vlan-transportzone-171", "segmentation_id": 171, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29dbee66-30", "ovs_interfaceid": "29dbee66-3044-4f1c-af2a-4a198595fd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2190.587125] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Releasing lock "refresh_cache-70632638-9c26-4c7b-a01e-9fa13edd409a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2190.587425] env[69648]: DEBUG nova.compute.manager [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Instance network_info: |[{"id": "29dbee66-3044-4f1c-af2a-4a198595fd4a", "address": "fa:16:3e:9a:f3:0d", "network": {"id": "50b7ce8f-9175-4b67-8364-26cdfc36bcb9", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1610618242-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f13aeffcea10467db8c0130518931a38", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd77ecbc-aaaf-45f4-ae8f-977d90e4052f", "external-id": "nsx-vlan-transportzone-171", "segmentation_id": 171, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29dbee66-30", "ovs_interfaceid": "29dbee66-3044-4f1c-af2a-4a198595fd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 
2190.587803] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9a:f3:0d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fd77ecbc-aaaf-45f4-ae8f-977d90e4052f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '29dbee66-3044-4f1c-af2a-4a198595fd4a', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2190.595460] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Creating folder: Project (f13aeffcea10467db8c0130518931a38). Parent ref: group-v692308. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2190.595959] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f3615e02-bcaa-451b-9924-1e9a861ff46f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2190.608819] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Created folder: Project (f13aeffcea10467db8c0130518931a38) in parent group-v692308. [ 2190.608993] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Creating folder: Instances. Parent ref: group-v692422. {{(pid=69648) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2190.609212] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b97c921c-977a-45d9-b9d2-0f50f3ca0eaf {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2190.618608] env[69648]: INFO nova.virt.vmwareapi.vm_util [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Created folder: Instances in parent group-v692422. [ 2190.618940] env[69648]: DEBUG oslo.service.loopingcall [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2190.619217] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2190.619497] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-97a22a88-ff0f-470c-aef6-989b091c87ad {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2190.638201] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2190.638201] env[69648]: value = "task-3466693" [ 2190.638201] env[69648]: _type = "Task" [ 2190.638201] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2190.645054] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466693, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2190.943230] env[69648]: DEBUG nova.compute.manager [req-a5bcaa82-a586-42b5-a375-683fd5244612 req-05461df2-5a9e-45b2-8488-df13e6e64cba service nova] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Received event network-changed-3273df74-77d0-4a83-acfc-cc91d6366c20 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2190.943439] env[69648]: DEBUG nova.compute.manager [req-a5bcaa82-a586-42b5-a375-683fd5244612 req-05461df2-5a9e-45b2-8488-df13e6e64cba service nova] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Refreshing instance network info cache due to event network-changed-3273df74-77d0-4a83-acfc-cc91d6366c20. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 2190.943648] env[69648]: DEBUG oslo_concurrency.lockutils [req-a5bcaa82-a586-42b5-a375-683fd5244612 req-05461df2-5a9e-45b2-8488-df13e6e64cba service nova] Acquiring lock "refresh_cache-349fb4bd-6187-4914-8322-082865bc5562" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2190.943795] env[69648]: DEBUG oslo_concurrency.lockutils [req-a5bcaa82-a586-42b5-a375-683fd5244612 req-05461df2-5a9e-45b2-8488-df13e6e64cba service nova] Acquired lock "refresh_cache-349fb4bd-6187-4914-8322-082865bc5562" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2190.943959] env[69648]: DEBUG nova.network.neutron [req-a5bcaa82-a586-42b5-a375-683fd5244612 req-05461df2-5a9e-45b2-8488-df13e6e64cba service nova] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Refreshing network info cache for port 3273df74-77d0-4a83-acfc-cc91d6366c20 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2191.150713] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466693, 'name': CreateVM_Task, 'duration_secs': 0.283039} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2191.150895] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2191.151603] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2191.151786] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2191.152110] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2191.152457] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d6727b97-c28e-4e2a-91b7-4baf9bc35f61 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2191.156697] env[69648]: DEBUG oslo_vmware.api [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Waiting for the task: (returnval){ [ 2191.156697] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52bc9ea6-f6ee-616b-9cee-85117a929b8f" [ 2191.156697] env[69648]: _type = "Task" [ 2191.156697] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2191.164178] env[69648]: DEBUG oslo_vmware.api [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52bc9ea6-f6ee-616b-9cee-85117a929b8f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2191.187200] env[69648]: DEBUG nova.network.neutron [req-a5bcaa82-a586-42b5-a375-683fd5244612 req-05461df2-5a9e-45b2-8488-df13e6e64cba service nova] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Updated VIF entry in instance network info cache for port 3273df74-77d0-4a83-acfc-cc91d6366c20. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2191.187597] env[69648]: DEBUG nova.network.neutron [req-a5bcaa82-a586-42b5-a375-683fd5244612 req-05461df2-5a9e-45b2-8488-df13e6e64cba service nova] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Updating instance_info_cache with network_info: [{"id": "3273df74-77d0-4a83-acfc-cc91d6366c20", "address": "fa:16:3e:92:a6:df", "network": {"id": "c571b585-f722-4160-a8c7-ee2a6eb60153", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-12468984-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ca41677808a749f1b88e43a112db7fb2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3273df74-77", "ovs_interfaceid": "3273df74-77d0-4a83-acfc-cc91d6366c20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2191.197370] env[69648]: DEBUG oslo_concurrency.lockutils [req-a5bcaa82-a586-42b5-a375-683fd5244612 req-05461df2-5a9e-45b2-8488-df13e6e64cba service nova] Releasing lock "refresh_cache-349fb4bd-6187-4914-8322-082865bc5562" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2191.666773] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2191.667053] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2191.667257] env[69648]: DEBUG oslo_concurrency.lockutils [None req-f4f8a03c-0151-48a6-9c9b-ff09b5dbcc09 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2191.963907] env[69648]: WARNING oslo_vmware.rw_handles [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2191.963907] env[69648]: ERROR 
oslo_vmware.rw_handles Traceback (most recent call last): [ 2191.963907] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2191.963907] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2191.963907] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2191.963907] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 2191.963907] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2191.963907] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2191.963907] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2191.963907] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2191.963907] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2191.963907] env[69648]: ERROR oslo_vmware.rw_handles [ 2191.964374] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/0278433c-efa4-42a1-9b22-af6f8c205f9a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2191.966929] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2191.967182] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Copying Virtual Disk [datastore1] vmware_temp/0278433c-efa4-42a1-9b22-af6f8c205f9a/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/0278433c-efa4-42a1-9b22-af6f8c205f9a/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2191.967461] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c2385cb2-e04c-4f16-b304-ddae0f65523a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2191.975567] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: (returnval){ [ 2191.975567] env[69648]: value = "task-3466694" [ 2191.975567] env[69648]: _type = "Task" [ 2191.975567] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2191.983300] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': task-3466694, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2192.324704] env[69648]: DEBUG nova.compute.manager [req-cec59165-c4ba-4298-9214-715ad7fd8f21 req-a5530b8e-e7b9-4e80-9da1-f33d94bf34c1 service nova] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Received event network-changed-29dbee66-3044-4f1c-af2a-4a198595fd4a {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2192.324909] env[69648]: DEBUG nova.compute.manager [req-cec59165-c4ba-4298-9214-715ad7fd8f21 req-a5530b8e-e7b9-4e80-9da1-f33d94bf34c1 service nova] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Refreshing instance network info cache due to event network-changed-29dbee66-3044-4f1c-af2a-4a198595fd4a. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 2192.325138] env[69648]: DEBUG oslo_concurrency.lockutils [req-cec59165-c4ba-4298-9214-715ad7fd8f21 req-a5530b8e-e7b9-4e80-9da1-f33d94bf34c1 service nova] Acquiring lock "refresh_cache-70632638-9c26-4c7b-a01e-9fa13edd409a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2192.325286] env[69648]: DEBUG oslo_concurrency.lockutils [req-cec59165-c4ba-4298-9214-715ad7fd8f21 req-a5530b8e-e7b9-4e80-9da1-f33d94bf34c1 service nova] Acquired lock "refresh_cache-70632638-9c26-4c7b-a01e-9fa13edd409a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2192.325447] env[69648]: DEBUG nova.network.neutron [req-cec59165-c4ba-4298-9214-715ad7fd8f21 req-a5530b8e-e7b9-4e80-9da1-f33d94bf34c1 service nova] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Refreshing network info cache for port 29dbee66-3044-4f1c-af2a-4a198595fd4a {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2192.486889] env[69648]: DEBUG oslo_vmware.exceptions [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2192.487224] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2192.487839] env[69648]: ERROR nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2192.487839] env[69648]: Faults: ['InvalidArgument'] [ 2192.487839] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Traceback (most recent call last): [ 2192.487839] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2192.487839] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] yield resources [ 2192.487839] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2192.487839] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] self.driver.spawn(context, instance, image_meta, [ 2192.487839] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2192.487839] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2192.487839] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2192.487839] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] self._fetch_image_if_missing(context, vi) [ 2192.487839] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] image_cache(vi, tmp_image_ds_loc) [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] vm_util.copy_virtual_disk( [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] session._wait_for_task(vmdk_copy_task) [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] return self.wait_for_task(task_ref) [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] return evt.wait() [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] result = hub.switch() [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2192.488251] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] return self.greenlet.switch() [ 2192.488647] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2192.488647] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] self.f(*self.args, **self.kw) [ 2192.488647] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2192.488647] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] raise exceptions.translate_fault(task_info.error) [ 2192.488647] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2192.488647] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Faults: ['InvalidArgument'] [ 2192.488647] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] [ 2192.488647] env[69648]: INFO nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Terminating instance [ 2192.489715] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2192.489926] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2192.490562] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 
tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2192.490753] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2192.490982] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-055f0952-d439-407c-8fcd-8467f5524c92 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.493288] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1410519c-b017-4285-86be-524a96504fb3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.502841] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2192.503047] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b2685ca4-8877-4c64-9438-baf837203062 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.505157] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2192.505357] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2192.506294] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8292d403-cb24-4c3b-9408-8f69032fbd05 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.510849] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: (returnval){ [ 2192.510849] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5243277c-8c81-8573-5141-c57392f157b7" [ 2192.510849] env[69648]: _type = "Task" [ 2192.510849] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2192.518136] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]5243277c-8c81-8573-5141-c57392f157b7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2192.566128] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2192.566356] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2192.566534] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Deleting the datastore file [datastore1] 18745ec2-477d-427d-b2dd-997f73d9fd53 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2192.566786] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7b23cb78-9d1b-43cb-9950-7910e6f24855 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.573153] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: (returnval){ [ 2192.573153] env[69648]: value = "task-3466696" [ 2192.573153] env[69648]: _type = "Task" [ 2192.573153] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2192.580337] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': task-3466696, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2192.608754] env[69648]: DEBUG nova.network.neutron [req-cec59165-c4ba-4298-9214-715ad7fd8f21 req-a5530b8e-e7b9-4e80-9da1-f33d94bf34c1 service nova] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Updated VIF entry in instance network info cache for port 29dbee66-3044-4f1c-af2a-4a198595fd4a. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2192.609115] env[69648]: DEBUG nova.network.neutron [req-cec59165-c4ba-4298-9214-715ad7fd8f21 req-a5530b8e-e7b9-4e80-9da1-f33d94bf34c1 service nova] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Updating instance_info_cache with network_info: [{"id": "29dbee66-3044-4f1c-af2a-4a198595fd4a", "address": "fa:16:3e:9a:f3:0d", "network": {"id": "50b7ce8f-9175-4b67-8364-26cdfc36bcb9", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1610618242-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f13aeffcea10467db8c0130518931a38", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd77ecbc-aaaf-45f4-ae8f-977d90e4052f", "external-id": "nsx-vlan-transportzone-171", "segmentation_id": 171, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29dbee66-30", "ovs_interfaceid": "29dbee66-3044-4f1c-af2a-4a198595fd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2192.619018] env[69648]: DEBUG oslo_concurrency.lockutils [req-cec59165-c4ba-4298-9214-715ad7fd8f21 req-a5530b8e-e7b9-4e80-9da1-f33d94bf34c1 service nova] Releasing lock "refresh_cache-70632638-9c26-4c7b-a01e-9fa13edd409a" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2193.021528] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2193.021839] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Creating directory with path [datastore1] vmware_temp/d7a180a6-1aae-4fc9-a688-e8eb7c8b9d8e/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2193.022012] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e9ad3a16-02ac-41a8-903c-6aa7ff63b292 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.037346] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Created directory with path [datastore1] vmware_temp/d7a180a6-1aae-4fc9-a688-e8eb7c8b9d8e/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2193.037549] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 
tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Fetch image to [datastore1] vmware_temp/d7a180a6-1aae-4fc9-a688-e8eb7c8b9d8e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2193.037758] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/d7a180a6-1aae-4fc9-a688-e8eb7c8b9d8e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2193.038513] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7b31136-1342-4e34-8523-8a8cf2defbf6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.044992] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6931f884-915e-4895-86cf-239d0777c16d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.054036] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a04eece6-437b-4bc8-8ab8-9f1d0447c3b4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.086269] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62280bfd-9545-440a-ae38-a3277d730237 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.092988] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': task-3466696, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082724} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2193.094382] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2193.094600] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2193.094787] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2193.094962] env[69648]: INFO nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Took 0.60 seconds to destroy the instance on the hypervisor. [ 2193.096709] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6b8f5b0d-d7f7-4d45-8dc9-a6abc045fe54 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.098575] env[69648]: DEBUG nova.compute.claims [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2193.098745] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2193.098952] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2193.120032] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2193.171754] env[69648]: DEBUG oslo_vmware.rw_handles [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Creating HTTP 
connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d7a180a6-1aae-4fc9-a688-e8eb7c8b9d8e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2193.231042] env[69648]: DEBUG oslo_vmware.rw_handles [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2193.231250] env[69648]: DEBUG oslo_vmware.rw_handles [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d7a180a6-1aae-4fc9-a688-e8eb7c8b9d8e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2193.321368] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-323d6e6a-fea3-43c6-8f13-2bc5767598b7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.329440] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22a899d4-5905-4a7a-bf19-4f2c0f12a7d5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.359862] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0d82734-e2b8-41ff-b302-fdb8e581b350 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.367119] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b230839-e921-4682-8669-865d725cd5c5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.379678] env[69648]: DEBUG nova.compute.provider_tree [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2193.388313] env[69648]: DEBUG nova.scheduler.client.report [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2193.401925] env[69648]: DEBUG oslo_concurrency.lockutils [None 
req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.303s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2193.402520] env[69648]: ERROR nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2193.402520] env[69648]: Faults: ['InvalidArgument'] [ 2193.402520] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Traceback (most recent call last): [ 2193.402520] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2193.402520] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] self.driver.spawn(context, instance, image_meta, [ 2193.402520] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2193.402520] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2193.402520] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2193.402520] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] self._fetch_image_if_missing(context, vi) [ 2193.402520] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2193.402520] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] image_cache(vi, tmp_image_ds_loc) [ 2193.402520] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] vm_util.copy_virtual_disk( [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] session._wait_for_task(vmdk_copy_task) [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] return self.wait_for_task(task_ref) [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] 
return evt.wait() [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] result = hub.switch() [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] return self.greenlet.switch() [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2193.402932] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] self.f(*self.args, **self.kw) [ 2193.403358] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2193.403358] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] raise exceptions.translate_fault(task_info.error) [ 2193.403358] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2193.403358] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Faults: ['InvalidArgument'] [ 2193.403358] env[69648]: ERROR nova.compute.manager [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] [ 2193.403358] env[69648]: DEBUG nova.compute.utils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2193.404607] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Build of instance 18745ec2-477d-427d-b2dd-997f73d9fd53 was re-scheduled: A specified parameter was not correct: fileType [ 2193.404607] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2193.404973] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2193.405161] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2193.405330] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2193.405502] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2193.690205] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2193.700530] env[69648]: INFO nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Took 0.29 seconds to deallocate network for instance. [ 2193.787234] env[69648]: INFO nova.scheduler.client.report [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Deleted allocations for instance 18745ec2-477d-427d-b2dd-997f73d9fd53 [ 2193.809392] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "18745ec2-477d-427d-b2dd-997f73d9fd53" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 598.243s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2193.809573] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "18745ec2-477d-427d-b2dd-997f73d9fd53" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 402.073s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2193.809799] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "18745ec2-477d-427d-b2dd-997f73d9fd53-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2193.810018] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "18745ec2-477d-427d-b2dd-997f73d9fd53-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2193.810229] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "18745ec2-477d-427d-b2dd-997f73d9fd53-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2193.812298] env[69648]: INFO nova.compute.manager [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Terminating instance [ 2193.814286] env[69648]: DEBUG nova.compute.manager [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2193.814553] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2193.814830] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-70b99897-073a-4be0-b0f9-7fb3c88d7bfe {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.825020] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aa6e3f7-8a2a-4db6-8d78-ed38d56f5b1a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.852746] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 18745ec2-477d-427d-b2dd-997f73d9fd53 could not be found. [ 2193.852961] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2193.853153] env[69648]: INFO nova.compute.manager [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2193.853438] env[69648]: DEBUG oslo.service.loopingcall [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2193.853662] env[69648]: DEBUG nova.compute.manager [-] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2193.853759] env[69648]: DEBUG nova.network.neutron [-] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2193.879179] env[69648]: DEBUG nova.network.neutron [-] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2193.888119] env[69648]: INFO nova.compute.manager [-] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] Took 0.03 seconds to deallocate network for instance. [ 2193.975173] env[69648]: DEBUG oslo_concurrency.lockutils [None req-62284cab-7f0b-463f-99e6-8c8d0d849ee5 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "18745ec2-477d-427d-b2dd-997f73d9fd53" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.165s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2193.975976] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "18745ec2-477d-427d-b2dd-997f73d9fd53" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 398.960s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2193.976174] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 18745ec2-477d-427d-b2dd-997f73d9fd53] During sync_power_state the instance has a pending task (deleting). Skip. 
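[editor's annotation] The trace above shows the build of instance 18745ec2-477d-427d-b2dd-997f73d9fd53 failing while caching the image: CopyVirtualDisk_Task returns an error, oslo.vmware's task poller translates it into VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']), and Nova then aborts the resource claim, deallocates the (empty) network info, re-schedules the build, and finally terminates the instance. The sketch below is a minimal, self-contained approximation of that poll-and-translate pattern, not Nova or oslo.vmware code; FakeTask, VimFault, wait_for_task and spawn_once here are illustrative stand-ins for the real logic in oslo_vmware/api.py (wait_for_task/_poll_task) and nova/compute/manager.py (_build_and_run_instance) referenced in the log paths.

# Illustrative sketch only -- mirrors the wait_for_task/_poll_task behaviour seen
# in the log without requiring a real vCenter session. All names are stand-ins.
import time


class VimFault(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def __init__(self, msg, fault_list):
        super().__init__(msg)
        self.fault_list = fault_list


class FakeTask:
    """Pretends to be a vCenter task that fails like CopyVirtualDisk_Task above."""

    def __init__(self):
        self._polls = 0

    def poll(self):
        self._polls += 1
        if self._polls < 2:
            return {"state": "running", "progress": 0}
        return {"state": "error",
                "message": "A specified parameter was not correct: fileType",
                "faults": ["InvalidArgument"]}


def wait_for_task(task, interval=0.1):
    """Poll until the task finishes; raise a translated fault if it errors out."""
    while True:
        info = task.poll()
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            # oslo.vmware does the equivalent via
            # exceptions.translate_fault(task_info.error) in _poll_task.
            raise VimFault(info["message"], info["faults"])
        time.sleep(interval)


def spawn_once(task):
    """Caller-side handling, loosely analogous to the manager's behaviour in the
    log: on a Vim fault the claim is aborted and the build is re-scheduled."""
    try:
        wait_for_task(task)
        return "ACTIVE"
    except VimFault as exc:
        print(f"spawn failed: {exc} (faults={exc.fault_list}); re-scheduling")
        return "RESCHEDULED"


if __name__ == "__main__":
    print(spawn_once(FakeTask()))

In the actual log the same sequence is visible as "Instance failed to spawn" followed by "Build of instance 18745ec2-477d-427d-b2dd-997f73d9fd53 was re-scheduled", with the build lock only released after roughly 598 seconds and the terminate path then finding the VM already gone on the backend (InstanceNotFound).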
[ 2193.976345] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "18745ec2-477d-427d-b2dd-997f73d9fd53" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2210.203870] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "85f7d10c-ddc9-4e9f-9462-7e67447bc8d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2210.204224] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "85f7d10c-ddc9-4e9f-9462-7e67447bc8d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2210.214517] env[69648]: DEBUG nova.compute.manager [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Starting instance... {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2210.261690] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2210.261965] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2210.263397] env[69648]: INFO nova.compute.claims [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2210.414246] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b66fdca3-2b01-448c-934c-82e7f21bc06d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2210.422508] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e8f928f-b555-4afc-ac81-bf83aef4191f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2210.451345] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e258b4b0-993e-43b0-8429-6b158a9794c9 
{{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2210.458584] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84611329-d705-4564-bfe7-dada6ac3b1c3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2210.472937] env[69648]: DEBUG nova.compute.provider_tree [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2210.483013] env[69648]: DEBUG nova.scheduler.client.report [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2210.496519] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.234s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2210.496973] env[69648]: DEBUG nova.compute.manager [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Start building networks asynchronously for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2210.528664] env[69648]: DEBUG nova.compute.utils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Using /dev/sd instead of None {{(pid=69648) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2210.530883] env[69648]: DEBUG nova.compute.manager [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Allocating IP information in the background. 
{{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2210.531204] env[69648]: DEBUG nova.network.neutron [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] allocate_for_instance() {{(pid=69648) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2210.541063] env[69648]: DEBUG nova.compute.manager [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Start building block device mappings for instance. {{(pid=69648) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2210.603526] env[69648]: DEBUG nova.compute.manager [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Start spawning the instance on the hypervisor. {{(pid=69648) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2210.624407] env[69648]: DEBUG nova.policy [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dcce409aea2f4744bda144de55e46052', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd1ecc20de6ab4597a08d93cca45ed56c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=69648) authorize /opt/stack/nova/nova/policy.py:203}} [ 2210.628015] env[69648]: DEBUG nova.virt.hardware [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T18:31:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T18:31:12Z,direct_url=,disk_format='vmdk',id=b010aefa-553b-437c-bd1e-78b0a276a491,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d4fe325ef395451d95fa750759fa3138',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T18:31:13Z,virtual_size=,visibility=), allow threads: False {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2210.628260] env[69648]: DEBUG nova.virt.hardware [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Flavor limits 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2210.628424] env[69648]: DEBUG nova.virt.hardware [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Image limits 0:0:0 {{(pid=69648) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2210.628634] env[69648]: DEBUG nova.virt.hardware [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Flavor pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2210.628748] env[69648]: DEBUG nova.virt.hardware [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Image pref 0:0:0 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2210.628911] env[69648]: DEBUG nova.virt.hardware [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=69648) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2210.629152] env[69648]: DEBUG nova.virt.hardware [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2210.629337] env[69648]: DEBUG nova.virt.hardware [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2210.629509] env[69648]: DEBUG nova.virt.hardware [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Got 1 possible topologies {{(pid=69648) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2210.629676] env[69648]: DEBUG nova.virt.hardware [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2210.629846] env[69648]: DEBUG nova.virt.hardware [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=69648) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2210.630740] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72a80da8-6b7b-4f75-9697-d8b3f61640c2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2210.638792] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1308908d-4699-42cc-8c9d-ed5fea8caa0f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2210.946198] env[69648]: DEBUG nova.network.neutron [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 
85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Successfully created port: 95338bcd-40a0-47d5-94b0-0073ed98b282 {{(pid=69648) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2211.510005] env[69648]: DEBUG nova.compute.manager [req-486f2c49-35bf-4fd7-a3a8-438e794998b2 req-b3f553c8-08f2-4968-bf54-56bbbb42b9b7 service nova] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Received event network-vif-plugged-95338bcd-40a0-47d5-94b0-0073ed98b282 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2211.510247] env[69648]: DEBUG oslo_concurrency.lockutils [req-486f2c49-35bf-4fd7-a3a8-438e794998b2 req-b3f553c8-08f2-4968-bf54-56bbbb42b9b7 service nova] Acquiring lock "85f7d10c-ddc9-4e9f-9462-7e67447bc8d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2211.510458] env[69648]: DEBUG oslo_concurrency.lockutils [req-486f2c49-35bf-4fd7-a3a8-438e794998b2 req-b3f553c8-08f2-4968-bf54-56bbbb42b9b7 service nova] Lock "85f7d10c-ddc9-4e9f-9462-7e67447bc8d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2211.510735] env[69648]: DEBUG oslo_concurrency.lockutils [req-486f2c49-35bf-4fd7-a3a8-438e794998b2 req-b3f553c8-08f2-4968-bf54-56bbbb42b9b7 service nova] Lock "85f7d10c-ddc9-4e9f-9462-7e67447bc8d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2211.510908] env[69648]: DEBUG nova.compute.manager [req-486f2c49-35bf-4fd7-a3a8-438e794998b2 req-b3f553c8-08f2-4968-bf54-56bbbb42b9b7 service nova] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] No waiting events found dispatching network-vif-plugged-95338bcd-40a0-47d5-94b0-0073ed98b282 {{(pid=69648) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2211.511233] env[69648]: WARNING nova.compute.manager [req-486f2c49-35bf-4fd7-a3a8-438e794998b2 req-b3f553c8-08f2-4968-bf54-56bbbb42b9b7 service nova] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Received unexpected event network-vif-plugged-95338bcd-40a0-47d5-94b0-0073ed98b282 for instance with vm_state building and task_state spawning. 
[ 2211.614522] env[69648]: DEBUG nova.network.neutron [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Successfully updated port: 95338bcd-40a0-47d5-94b0-0073ed98b282 {{(pid=69648) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2211.624660] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "refresh_cache-85f7d10c-ddc9-4e9f-9462-7e67447bc8d6" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2211.624811] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquired lock "refresh_cache-85f7d10c-ddc9-4e9f-9462-7e67447bc8d6" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2211.624962] env[69648]: DEBUG nova.network.neutron [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2211.676617] env[69648]: DEBUG nova.network.neutron [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Instance cache missing network info. 
{{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2211.833682] env[69648]: DEBUG nova.network.neutron [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Updating instance_info_cache with network_info: [{"id": "95338bcd-40a0-47d5-94b0-0073ed98b282", "address": "fa:16:3e:fb:d5:92", "network": {"id": "fc9dfa48-a79d-4532-a3ed-e1ad779b2906", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-685310351-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d1ecc20de6ab4597a08d93cca45ed56c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee018eb-75be-4037-a80a-07034d4eae35", "external-id": "nsx-vlan-transportzone-8", "segmentation_id": 8, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap95338bcd-40", "ovs_interfaceid": "95338bcd-40a0-47d5-94b0-0073ed98b282", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2211.843729] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Releasing lock "refresh_cache-85f7d10c-ddc9-4e9f-9462-7e67447bc8d6" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2211.844146] env[69648]: DEBUG nova.compute.manager [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Instance network_info: |[{"id": "95338bcd-40a0-47d5-94b0-0073ed98b282", "address": "fa:16:3e:fb:d5:92", "network": {"id": "fc9dfa48-a79d-4532-a3ed-e1ad779b2906", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-685310351-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d1ecc20de6ab4597a08d93cca45ed56c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee018eb-75be-4037-a80a-07034d4eae35", "external-id": "nsx-vlan-transportzone-8", "segmentation_id": 8, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap95338bcd-40", "ovs_interfaceid": "95338bcd-40a0-47d5-94b0-0073ed98b282", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=69648) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2211.844543] env[69648]: DEBUG 
nova.virt.vmwareapi.vmops [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fb:d5:92', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2ee018eb-75be-4037-a80a-07034d4eae35', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '95338bcd-40a0-47d5-94b0-0073ed98b282', 'vif_model': 'vmxnet3'}] {{(pid=69648) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2211.852183] env[69648]: DEBUG oslo.service.loopingcall [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2211.852664] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Creating VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2211.852893] env[69648]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d6816f69-156a-430b-97f4-7456a5eecb8f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2211.873385] env[69648]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2211.873385] env[69648]: value = "task-3466697" [ 2211.873385] env[69648]: _type = "Task" [ 2211.873385] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2211.880970] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466697, 'name': CreateVM_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2212.384261] env[69648]: DEBUG oslo_vmware.api [-] Task: {'id': task-3466697, 'name': CreateVM_Task, 'duration_secs': 0.322157} completed successfully. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2212.384407] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Created VM on the ESX host {{(pid=69648) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2212.385034] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2212.385306] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2212.385543] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2212.385791] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8d421c1f-fa31-4b26-adc3-41cf38e3739b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2212.390222] env[69648]: DEBUG oslo_vmware.api [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for the task: (returnval){ [ 2212.390222] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]525dd91e-8de2-fcd3-d90e-781565f6c4c2" [ 2212.390222] env[69648]: _type = "Task" [ 2212.390222] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2212.397410] env[69648]: DEBUG oslo_vmware.api [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]525dd91e-8de2-fcd3-d90e-781565f6c4c2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2212.900234] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2212.900579] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Processing image b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2212.900689] env[69648]: DEBUG oslo_concurrency.lockutils [None req-9ad74a7c-f903-4c5d-87ce-a9c0f69ba290 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2213.538374] env[69648]: DEBUG nova.compute.manager [req-3ca306ec-f811-4a02-bba6-8efb62c640da req-dc814aed-86f6-4f78-9db8-c24730dcba62 service nova] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Received event network-changed-95338bcd-40a0-47d5-94b0-0073ed98b282 {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2213.538582] env[69648]: DEBUG nova.compute.manager [req-3ca306ec-f811-4a02-bba6-8efb62c640da req-dc814aed-86f6-4f78-9db8-c24730dcba62 service nova] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Refreshing instance network info cache due to event network-changed-95338bcd-40a0-47d5-94b0-0073ed98b282. {{(pid=69648) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 2213.538806] env[69648]: DEBUG oslo_concurrency.lockutils [req-3ca306ec-f811-4a02-bba6-8efb62c640da req-dc814aed-86f6-4f78-9db8-c24730dcba62 service nova] Acquiring lock "refresh_cache-85f7d10c-ddc9-4e9f-9462-7e67447bc8d6" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2213.538954] env[69648]: DEBUG oslo_concurrency.lockutils [req-3ca306ec-f811-4a02-bba6-8efb62c640da req-dc814aed-86f6-4f78-9db8-c24730dcba62 service nova] Acquired lock "refresh_cache-85f7d10c-ddc9-4e9f-9462-7e67447bc8d6" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2213.539137] env[69648]: DEBUG nova.network.neutron [req-3ca306ec-f811-4a02-bba6-8efb62c640da req-dc814aed-86f6-4f78-9db8-c24730dcba62 service nova] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Refreshing network info cache for port 95338bcd-40a0-47d5-94b0-0073ed98b282 {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2213.767477] env[69648]: DEBUG nova.network.neutron [req-3ca306ec-f811-4a02-bba6-8efb62c640da req-dc814aed-86f6-4f78-9db8-c24730dcba62 service nova] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Updated VIF entry in instance network info cache for port 95338bcd-40a0-47d5-94b0-0073ed98b282. 
{{(pid=69648) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2213.767835] env[69648]: DEBUG nova.network.neutron [req-3ca306ec-f811-4a02-bba6-8efb62c640da req-dc814aed-86f6-4f78-9db8-c24730dcba62 service nova] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Updating instance_info_cache with network_info: [{"id": "95338bcd-40a0-47d5-94b0-0073ed98b282", "address": "fa:16:3e:fb:d5:92", "network": {"id": "fc9dfa48-a79d-4532-a3ed-e1ad779b2906", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-685310351-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d1ecc20de6ab4597a08d93cca45ed56c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee018eb-75be-4037-a80a-07034d4eae35", "external-id": "nsx-vlan-transportzone-8", "segmentation_id": 8, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap95338bcd-40", "ovs_interfaceid": "95338bcd-40a0-47d5-94b0-0073ed98b282", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2213.776866] env[69648]: DEBUG oslo_concurrency.lockutils [req-3ca306ec-f811-4a02-bba6-8efb62c640da req-dc814aed-86f6-4f78-9db8-c24730dcba62 service nova] Releasing lock "refresh_cache-85f7d10c-ddc9-4e9f-9462-7e67447bc8d6" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2223.065618] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2225.061555] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2225.064190] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2225.064380] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2227.065034] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2228.060607] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task 
ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2232.065516] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2232.065798] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2233.065540] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2233.077503] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2233.077726] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2233.077911] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2233.078095] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2233.079439] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d0ff7db-96e5-4504-a85e-0935d7a8ae44 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.090125] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e6c1048-9549-4db4-a583-311cefa42207 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.104104] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2386ae9b-2455-448c-9e48-4c959b552db6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.110346] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1c702b5-aef9-415f-976f-3b3742774850 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.139971] env[69648]: DEBUG nova.compute.resource_tracker [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180973MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2233.140140] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2233.140312] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2233.212987] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2233.213185] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2233.213317] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2233.213443] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2233.213566] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cb6b7f04-1c44-4998-bd28-8a01c4b235e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2233.213697] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cc77a95f-ea00-4b01-96ac-8256672eeb39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2233.213905] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6f2b5030-4606-4873-a80b-186b841cc7dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2233.214067] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 349fb4bd-6187-4914-8322-082865bc5562 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2233.214193] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 70632638-9c26-4c7b-a01e-9fa13edd409a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2233.214312] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2233.214517] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2233.214656] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2233.325246] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d23fd4ee-58f5-4b78-9b62-8142fc25d819 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.333213] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98b87f14-489a-4549-9bd7-0b7f5dc7b6b4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.361903] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3707aa67-e08f-41e5-a53d-ada1e83084ec {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.368660] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5e85bf9-cbec-44e2-a89f-69b02c5c9475 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.382578] env[69648]: DEBUG 
nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2233.390510] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2233.404149] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2233.404324] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2236.403984] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2237.065616] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2237.065809] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 2237.065941] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2237.085310] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2237.085463] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2237.085598] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Skipping network cache update for instance because it is Building. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2237.085726] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2237.085850] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2237.085976] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2237.086162] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2237.086218] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2237.086338] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2237.086453] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2237.086574] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2239.024097] env[69648]: WARNING oslo_vmware.rw_handles [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2239.024097] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2239.024097] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2239.024097] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2239.024097] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2239.024097] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 2239.024097] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2239.024097] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2239.024097] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2239.024097] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2239.024097] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2239.024097] env[69648]: ERROR oslo_vmware.rw_handles [ 2239.024774] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/d7a180a6-1aae-4fc9-a688-e8eb7c8b9d8e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2239.027115] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2239.027373] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Copying Virtual Disk [datastore1] vmware_temp/d7a180a6-1aae-4fc9-a688-e8eb7c8b9d8e/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/d7a180a6-1aae-4fc9-a688-e8eb7c8b9d8e/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2239.027655] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-eccd5a5b-8812-4527-912a-30dd6db71eed {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.037195] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: 
(returnval){ [ 2239.037195] env[69648]: value = "task-3466698" [ 2239.037195] env[69648]: _type = "Task" [ 2239.037195] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2239.045178] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': task-3466698, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2239.547827] env[69648]: DEBUG oslo_vmware.exceptions [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2239.548148] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2239.548754] env[69648]: ERROR nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2239.548754] env[69648]: Faults: ['InvalidArgument'] [ 2239.548754] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Traceback (most recent call last): [ 2239.548754] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2239.548754] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] yield resources [ 2239.548754] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2239.548754] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] self.driver.spawn(context, instance, image_meta, [ 2239.548754] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2239.548754] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2239.548754] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2239.548754] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] self._fetch_image_if_missing(context, vi) [ 2239.548754] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 
3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] image_cache(vi, tmp_image_ds_loc) [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] vm_util.copy_virtual_disk( [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] session._wait_for_task(vmdk_copy_task) [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] return self.wait_for_task(task_ref) [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] return evt.wait() [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] result = hub.switch() [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2239.549214] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] return self.greenlet.switch() [ 2239.549601] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2239.549601] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] self.f(*self.args, **self.kw) [ 2239.549601] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2239.549601] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] raise exceptions.translate_fault(task_info.error) [ 2239.549601] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2239.549601] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Faults: ['InvalidArgument'] [ 2239.549601] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] [ 2239.549601] env[69648]: INFO nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Terminating instance [ 2239.550675] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 
tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2239.550894] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2239.551159] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1700ebcd-d97f-4939-b61c-87d4c5374e7a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.553412] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2239.553570] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2239.554310] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44cf3784-e33f-41eb-83bd-c84cf8b05157 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.561178] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2239.561389] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fde4a5bd-1226-4394-b6da-a9cf86488b8e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.563621] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2239.563796] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2239.564797] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b29ddf5b-4c9a-4b75-9c62-599a95b420e3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.569560] env[69648]: DEBUG oslo_vmware.api [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Waiting for the task: (returnval){ [ 2239.569560] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52f18864-2af4-0b25-ac58-834ef5dcb2fa" [ 2239.569560] env[69648]: _type = "Task" [ 2239.569560] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2239.576543] env[69648]: DEBUG oslo_vmware.api [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52f18864-2af4-0b25-ac58-834ef5dcb2fa, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2239.631359] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2239.631614] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2239.631825] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Deleting the datastore file [datastore1] 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2239.632143] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-44167033-298a-424b-980e-93812c21f1ad {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.639074] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for the task: (returnval){ [ 2239.639074] env[69648]: value = "task-3466700" [ 2239.639074] env[69648]: _type = "Task" [ 2239.639074] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2239.646528] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': task-3466700, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2240.080882] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2240.081254] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Creating directory with path [datastore1] vmware_temp/b7d4df02-661f-4181-9847-81f91aeb4d53/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2240.081357] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0f263a2c-fa8e-4bcd-94dc-e4a1bec9805f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.093765] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Created directory with path [datastore1] vmware_temp/b7d4df02-661f-4181-9847-81f91aeb4d53/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2240.093991] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Fetch image to [datastore1] vmware_temp/b7d4df02-661f-4181-9847-81f91aeb4d53/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2240.094137] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/b7d4df02-661f-4181-9847-81f91aeb4d53/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2240.094905] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fd6dfee-e1e7-4a75-887c-74b77291136d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.101864] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-723fa983-fd01-4a12-a018-90a3d23dc82f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.111292] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a1628c8-a22b-4adb-8d43-593c1cdbdea4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.146918] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cda88b1-b951-4293-a01e-28e5a91f3ed1 {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.154849] env[69648]: DEBUG oslo_vmware.api [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Task: {'id': task-3466700, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08097} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2240.156608] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2240.156819] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2240.156994] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2240.157190] env[69648]: INFO nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Took 0.60 seconds to destroy the instance on the hypervisor. 
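The DeleteDatastoreFile_Task records above follow the usual oslo.vmware pattern that Nova's ds_util relies on: invoke the SOAP method through the API session, then block on the returned task reference until vCenter reports completion or a fault. The sketch below is illustrative only; the helper name, its arguments, and the FileNotFoundException handling are assumptions rather than Nova's exact code, but the calls themselves (session.invoke_api, session.wait_for_task) are the public oslo.vmware entry points visible in this log.

```python
# Illustrative sketch of the DeleteDatastoreFile_Task flow logged above;
# the helper name and its error handling are assumptions, not Nova's code.
from oslo_vmware import exceptions as vexc


def delete_datastore_file(session, ds_path, dc_ref):
    """Delete a datastore path and wait for the vCenter task to finish.

    session is an oslo_vmware.api.VMwareAPISession; ds_path is a datastore
    path such as '[datastore1] <instance-uuid>'; dc_ref is the Datacenter
    managed object reference that owns the path.
    """
    file_manager = session.vim.service_content.fileManager
    # Starts the asynchronous vCenter task (FileManager.DeleteDatastoreFile_Task).
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager,
                              name=ds_path, datacenter=dc_ref)
    try:
        # Polls the task (the "progress is 0%" lines above) and raises a
        # VimFaultException if vCenter reports the task as failed.
        session.wait_for_task(task)
    except vexc.FileNotFoundException:
        # Treat an already-missing file as a successful delete.
        pass
```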
[ 2240.159949] env[69648]: DEBUG nova.compute.claims [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2240.160138] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2240.160361] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2240.163024] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3ee3a09e-1b68-4216-b148-47823e77be92 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.201035] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2240.347288] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e47a9c4-662d-4779-a6a2-fcdb0f626fe9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.356080] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42dd7330-48b2-46fa-88d1-31843450ca77 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.386581] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2240.387410] env[69648]: ERROR nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. 
[ 2240.387410] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Traceback (most recent call last): [ 2240.387410] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 2240.387410] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 2240.387410] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 2240.387410] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] result = getattr(controller, method)(*args, **kwargs) [ 2240.387410] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 2240.387410] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self._get(image_id) [ 2240.387410] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 2240.387410] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 2240.387410] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] resp, body = self.http_client.get(url, headers=header) [ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self.request(url, 'GET', **kwargs) [ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self._handle_response(resp) [ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] raise exc.from_response(resp, resp.content) [ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] [ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] During handling of the above exception, another exception occurred: [ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] [ 2240.387905] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Traceback (most recent call last): [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] yield resources [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self.driver.spawn(context, instance, image_meta, [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self._fetch_image_if_missing(context, vi) [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] image_fetch(context, vi, tmp_image_ds_loc) [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] images.fetch_image( [ 2240.388458] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] metadata = IMAGE_API.get(context, image_ref) [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return session.show(context, image_id, [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] _reraise_translated_image_exception(image_id) [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] raise new_exc.with_traceback(exc_trace) [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] result = getattr(controller, method)(*args, **kwargs) [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 2240.388998] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self._get(image_id) [ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] resp, body = self.http_client.get(url, headers=header) [ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self.request(url, 'GET', **kwargs) [ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self._handle_response(resp) [ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] raise exc.from_response(resp, resp.content) [ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. 
[ 2240.389591] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] [ 2240.390136] env[69648]: INFO nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Terminating instance [ 2240.390136] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquiring lock "refresh_cache-0a8de1d1-a783-4a32-9ee0-abb023943eeb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2240.390136] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquired lock "refresh_cache-0a8de1d1-a783-4a32-9ee0-abb023943eeb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2240.390299] env[69648]: DEBUG nova.network.neutron [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2240.392800] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c45f7b6c-1673-4646-ae88-8a7f2ae9c959 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.395833] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2240.396144] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2240.396375] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b9fd63c7-ba0e-4e10-ae4f-caa45ea6e409 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.405650] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d964efc4-40b3-474b-a189-82bb20239dcc {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.412124] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2240.412325] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 
tempest-AttachVolumeTestJSON-1843339312-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2240.413445] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c18c5d41-a719-4c59-b1e9-022c9e50db74 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.423799] env[69648]: DEBUG nova.compute.provider_tree [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2240.426359] env[69648]: DEBUG nova.network.neutron [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2240.431176] env[69648]: DEBUG oslo_vmware.api [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for the task: (returnval){ [ 2240.431176] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]525b7c0b-9e48-7122-a028-95eeda74f533" [ 2240.431176] env[69648]: _type = "Task" [ 2240.431176] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2240.436486] env[69648]: DEBUG nova.scheduler.client.report [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2240.447945] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2240.448780] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Creating directory with path [datastore1] vmware_temp/f8f5a70c-1344-4b5f-85d6-a5c209d0deb8/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2240.449491] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b2977b8e-b0b9-4eb7-9f51-c888c02ba1f8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.453027] 
env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.293s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2240.453549] env[69648]: ERROR nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2240.453549] env[69648]: Faults: ['InvalidArgument'] [ 2240.453549] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Traceback (most recent call last): [ 2240.453549] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2240.453549] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] self.driver.spawn(context, instance, image_meta, [ 2240.453549] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2240.453549] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2240.453549] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2240.453549] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] self._fetch_image_if_missing(context, vi) [ 2240.453549] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2240.453549] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] image_cache(vi, tmp_image_ds_loc) [ 2240.453549] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] vm_util.copy_virtual_disk( [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] session._wait_for_task(vmdk_copy_task) [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] return self.wait_for_task(task_ref) [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2240.453876] env[69648]: ERROR 
nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] return evt.wait() [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] result = hub.switch() [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] return self.greenlet.switch() [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2240.453876] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] self.f(*self.args, **self.kw) [ 2240.454190] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2240.454190] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] raise exceptions.translate_fault(task_info.error) [ 2240.454190] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2240.454190] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Faults: ['InvalidArgument'] [ 2240.454190] env[69648]: ERROR nova.compute.manager [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] [ 2240.454307] env[69648]: DEBUG nova.compute.utils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2240.456513] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Build of instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd was re-scheduled: A specified parameter was not correct: fileType [ 2240.456513] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2240.456914] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2240.457108] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2240.457299] env[69648]: DEBUG nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2240.457462] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2240.475721] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Created directory with path [datastore1] vmware_temp/f8f5a70c-1344-4b5f-85d6-a5c209d0deb8/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2240.475974] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Fetch image to [datastore1] vmware_temp/f8f5a70c-1344-4b5f-85d6-a5c209d0deb8/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2240.476105] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/f8f5a70c-1344-4b5f-85d6-a5c209d0deb8/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2240.476915] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68e8b8ed-4800-4bd1-9b8b-0d4e31ec1bda {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.483908] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc34da1d-de61-47c3-ab95-f3aca3563358 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.494038] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea403768-5f39-47bd-8975-53d926718436 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.527087] env[69648]: DEBUG nova.network.neutron [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2240.528504] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-d78ad590-8a38-47df-9208-a22351c9439c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.535841] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Releasing lock "refresh_cache-0a8de1d1-a783-4a32-9ee0-abb023943eeb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2240.536255] env[69648]: DEBUG nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2240.536448] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2240.538839] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e433507-c96d-435d-9c66-36e01957c90a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.542580] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c12bd71d-df02-4993-8569-ba367a36a387 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.549033] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2240.549282] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-81f92c28-169c-477f-a26f-bf90c91871b2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.565334] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2240.581882] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2240.582134] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2240.582307] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Deleting the datastore file [datastore1] 0a8de1d1-a783-4a32-9ee0-abb023943eeb {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2240.582616] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-af73a312-c1c6-4f63-92af-86a726d3d799 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.589762] env[69648]: DEBUG oslo_vmware.api [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Waiting for the task: (returnval){ [ 2240.589762] env[69648]: value = "task-3466702" [ 2240.589762] env[69648]: _type = "Task" [ 2240.589762] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2240.601308] env[69648]: DEBUG oslo_vmware.api [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Task: {'id': task-3466702, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2240.632995] env[69648]: DEBUG oslo_vmware.rw_handles [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f8f5a70c-1344-4b5f-85d6-a5c209d0deb8/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2240.698591] env[69648]: DEBUG oslo_vmware.rw_handles [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2240.698872] env[69648]: DEBUG oslo_vmware.rw_handles [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f8f5a70c-1344-4b5f-85d6-a5c209d0deb8/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2241.053267] env[69648]: DEBUG nova.network.neutron [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2241.069386] env[69648]: INFO nova.compute.manager [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Took 0.61 seconds to deallocate network for instance. [ 2241.101620] env[69648]: DEBUG oslo_vmware.api [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Task: {'id': task-3466702, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.041037} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2241.101953] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2241.102165] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2241.102362] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2241.102536] env[69648]: INFO nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Took 0.57 seconds to destroy the instance on the hypervisor. [ 2241.102791] env[69648]: DEBUG oslo.service.loopingcall [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2241.103055] env[69648]: DEBUG nova.compute.manager [-] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Skipping network deallocation for instance since networking was not requested. 
{{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 2241.105223] env[69648]: DEBUG nova.compute.claims [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2241.105396] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2241.105610] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2241.182607] env[69648]: INFO nova.scheduler.client.report [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Deleted allocations for instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd [ 2241.206138] env[69648]: DEBUG oslo_concurrency.lockutils [None req-eaab1fbf-daa7-4652-9ea5-8147b958dd80 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 645.612s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2241.206414] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 449.531s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2241.206646] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Acquiring lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2241.206851] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2241.207030] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 
tempest-MultipleCreateTestJSON-846010339-project-member] Lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2241.209231] env[69648]: INFO nova.compute.manager [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Terminating instance [ 2241.211053] env[69648]: DEBUG nova.compute.manager [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2241.211264] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2241.211789] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-93f9dc3c-e512-4d60-856f-817c0cfb5ca9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.225401] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b94de2c6-ca2b-4a47-a0e3-83d1028e5953 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.256391] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd could not be found. [ 2241.256590] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2241.256766] env[69648]: INFO nova.compute.manager [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Took 0.05 seconds to destroy the instance on the hypervisor. [ 2241.257016] env[69648]: DEBUG oslo.service.loopingcall [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2241.259476] env[69648]: DEBUG nova.compute.manager [-] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2241.259581] env[69648]: DEBUG nova.network.neutron [-] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2241.283895] env[69648]: DEBUG nova.network.neutron [-] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2241.291891] env[69648]: INFO nova.compute.manager [-] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] Took 0.03 seconds to deallocate network for instance. [ 2241.306222] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2501ba3b-c849-49d3-ae1a-2bffb75203f2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.314495] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43b6bf9a-504a-4744-9818-d3e37e9cc3cd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.348915] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96f862f7-ff4a-41b0-b5f2-c3f8231d2dbb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.356862] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58cb9bf6-0e6c-413b-87e0-7e81f47dcdae {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.372409] env[69648]: DEBUG nova.compute.provider_tree [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2241.380899] env[69648]: DEBUG nova.scheduler.client.report [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2241.397030] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.291s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2241.397708] env[69648]: ERROR nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. [ 2241.397708] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Traceback (most recent call last): [ 2241.397708] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 2241.397708] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 2241.397708] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 2241.397708] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] result = getattr(controller, method)(*args, **kwargs) [ 2241.397708] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 2241.397708] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self._get(image_id) [ 2241.397708] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 2241.397708] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 2241.397708] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] resp, body = self.http_client.get(url, headers=header) [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self.request(url, 'GET', **kwargs) [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self._handle_response(resp) [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] raise exc.from_response(resp, resp.content) [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not 
verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] During handling of the above exception, another exception occurred: [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] [ 2241.398066] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Traceback (most recent call last): [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self.driver.spawn(context, instance, image_meta, [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self._fetch_image_if_missing(context, vi) [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] image_fetch(context, vi, tmp_image_ds_loc) [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] images.fetch_image( [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] metadata = IMAGE_API.get(context, image_ref) [ 2241.398383] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return session.show(context, image_id, [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] _reraise_translated_image_exception(image_id) [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 2241.398806] 
env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] raise new_exc.with_traceback(exc_trace) [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] result = getattr(controller, method)(*args, **kwargs) [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self._get(image_id) [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 2241.398806] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 2241.399171] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 2241.399171] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] resp, body = self.http_client.get(url, headers=header) [ 2241.399171] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 2241.399171] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self.request(url, 'GET', **kwargs) [ 2241.399171] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 2241.399171] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self._handle_response(resp) [ 2241.399171] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 2241.399171] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] raise exc.from_response(resp, resp.content) [ 2241.399171] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] nova.exception.ImageNotAuthorized: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. [ 2241.399171] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] [ 2241.399171] env[69648]: DEBUG nova.compute.utils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. 
{{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2241.399779] env[69648]: DEBUG nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Build of instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb was re-scheduled: Not authorized for image b010aefa-553b-437c-bd1e-78b0a276a491. {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2241.400250] env[69648]: DEBUG nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2241.400475] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquiring lock "refresh_cache-0a8de1d1-a783-4a32-9ee0-abb023943eeb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2241.400623] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquired lock "refresh_cache-0a8de1d1-a783-4a32-9ee0-abb023943eeb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2241.400783] env[69648]: DEBUG nova.network.neutron [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2241.406931] env[69648]: DEBUG oslo_concurrency.lockutils [None req-dd74f05a-9704-42d9-a1ad-8e5d91f509f3 tempest-MultipleCreateTestJSON-846010339 tempest-MultipleCreateTestJSON-846010339-project-member] Lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.200s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2241.407391] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 446.391s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2241.407571] env[69648]: INFO nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 3e98f3e7-bcc8-4444-883a-6ad0fe6145cd] During sync_power_state the instance has a pending task (deleting). Skip. 
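The build failure for instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb above is a token problem surfacing through the image API: python-glanceclient received HTTP 401 from Glance and nova/image/glance.py re-raised it as ImageNotAuthorized, which is why the build was re-scheduled and the allocation cleaned up. A minimal sketch of that translation path, assuming a glanceclient v2 Client built from the request context's (here, no longer valid) token; this is illustrative, not Nova's actual implementation:

    import glanceclient.exc as glance_exc

    class ImageNotAuthorized(Exception):
        """Stand-in for nova.exception.ImageNotAuthorized (illustrative only)."""

    def show_image(client, image_id):
        # client: a glanceclient.v2.client.Client; images.get() issues
        # GET /v2/images/{image_id} with the caller's token.
        try:
            return client.images.get(image_id)
        except glance_exc.HTTPUnauthorized:
            # Same effect as the _reraise_translated_image_exception() frame in
            # the traceback above: a 401 from Glance surfaces to the compute
            # manager as "Not authorized for image <id>".
            raise ImageNotAuthorized(image_id)

With a valid token the same call simply returns the image metadata that the VMware driver's fetch_image() path needs.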
[ 2241.407742] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "3e98f3e7-bcc8-4444-883a-6ad0fe6145cd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2241.426499] env[69648]: DEBUG nova.network.neutron [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2241.489319] env[69648]: DEBUG nova.network.neutron [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2241.498742] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Releasing lock "refresh_cache-0a8de1d1-a783-4a32-9ee0-abb023943eeb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2241.499014] env[69648]: DEBUG nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2241.499225] env[69648]: DEBUG nova.compute.manager [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Skipping network deallocation for instance since networking was not requested. 
{{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 2241.582849] env[69648]: INFO nova.scheduler.client.report [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Deleted allocations for instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb [ 2241.599770] env[69648]: DEBUG oslo_concurrency.lockutils [None req-be4b9098-b8e7-4026-970f-e635819df248 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Lock "0a8de1d1-a783-4a32-9ee0-abb023943eeb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 637.437s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2241.599990] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Lock "0a8de1d1-a783-4a32-9ee0-abb023943eeb" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 441.692s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2241.600393] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquiring lock "0a8de1d1-a783-4a32-9ee0-abb023943eeb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2241.600479] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Lock "0a8de1d1-a783-4a32-9ee0-abb023943eeb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2241.600572] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Lock "0a8de1d1-a783-4a32-9ee0-abb023943eeb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2241.602370] env[69648]: INFO nova.compute.manager [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Terminating instance [ 2241.604068] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquiring lock "refresh_cache-0a8de1d1-a783-4a32-9ee0-abb023943eeb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2241.604236] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Acquired lock "refresh_cache-0a8de1d1-a783-4a32-9ee0-abb023943eeb" 
{{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2241.604401] env[69648]: DEBUG nova.network.neutron [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Building network info cache for instance {{(pid=69648) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2241.630754] env[69648]: DEBUG nova.network.neutron [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Instance cache missing network info. {{(pid=69648) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2241.691120] env[69648]: DEBUG nova.network.neutron [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2241.700274] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Releasing lock "refresh_cache-0a8de1d1-a783-4a32-9ee0-abb023943eeb" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2241.700692] env[69648]: DEBUG nova.compute.manager [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2241.700890] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2241.701426] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-924186e7-2376-4fe6-a26b-aca70dc337fd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.711548] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-546cdd52-4aae-46a4-b95b-a70a08e8c96d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.739838] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0a8de1d1-a783-4a32-9ee0-abb023943eeb could not be found. 
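The SearchIndex.FindAllByUuid and RetrievePropertiesEx calls just above are the VMware driver checking whether a backing VM still exists for the instance being deleted; the lookup comes back empty, so the driver logs InstanceNotFound and treats the destroy as already complete. A rough sketch of that kind of lookup with oslo.vmware, assuming an already established VMwareAPISession named session (illustrative only, not Nova's vm_util code):

    def find_vm_by_instance_uuid(session, instance_uuid):
        # SearchIndex.FindAllByUuid with instanceUuid=True matches the vCenter
        # "instance UUID" that the driver stamps on VMs it creates.
        search_index = session.vim.service_content.searchIndex
        vm_refs = session.invoke_api(session.vim, 'FindAllByUuid', search_index,
                                     uuid=instance_uuid, vmSearch=True,
                                     instanceUuid=True)
        # An empty result is what produces the "Instance does not exist on
        # backend" warning above: the VM is already gone, so destroy() only has
        # to clean up the Nova-side records.
        return vm_refs[0] if vm_refs else None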
[ 2241.740044] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2241.740229] env[69648]: INFO nova.compute.manager [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2241.740463] env[69648]: DEBUG oslo.service.loopingcall [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2241.740678] env[69648]: DEBUG nova.compute.manager [-] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2241.740786] env[69648]: DEBUG nova.network.neutron [-] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2241.845910] env[69648]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=69648) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 2241.846208] env[69648]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2241.846737] env[69648]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-43c9e28f-86e6-458b-83e4-2f830493fa10'] [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2241.846737] env[69648]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall self._deallocate_network( [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2241.847192] env[69648]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2241.847192] env[69648]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2241.847671] env[69648]: ERROR oslo.service.loopingcall [ 2241.848108] env[69648]: ERROR nova.compute.manager [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2241.874675] env[69648]: ERROR nova.compute.manager [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2241.874675] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Traceback (most recent call last): [ 2241.874675] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.874675] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] ret = obj(*args, **kwargs) [ 2241.874675] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2241.874675] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] exception_handler_v20(status_code, error_body) [ 2241.874675] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2241.874675] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] raise client_exc(message=error_message, [ 2241.874675] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2241.874675] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Neutron server returns request_ids: ['req-43c9e28f-86e6-458b-83e4-2f830493fa10'] [ 2241.874675] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] During handling of the above exception, another exception occurred: [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Traceback (most recent call last): [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self._delete_instance(context, instance, bdms) [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self._shutdown_instance(context, instance, bdms) [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self._try_deallocate_network(context, instance, requested_networks) [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] with excutils.save_and_reraise_exception(): [ 2241.875094] env[69648]: ERROR 
nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2241.875094] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self.force_reraise() [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] raise self.value [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] _deallocate_network_with_retries() [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return evt.wait() [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] result = hub.switch() [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self.greenlet.switch() [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2241.875455] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] result = func(*self.args, **self.kw) [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] result = f(*args, **kwargs) [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self._deallocate_network( [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self.network_api.deallocate_for_instance( [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 
0a8de1d1-a783-4a32-9ee0-abb023943eeb] data = neutron.list_ports(**search_opts) [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] ret = obj(*args, **kwargs) [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self.list('ports', self.ports_path, retrieve_all, [ 2241.875794] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] ret = obj(*args, **kwargs) [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] for r in self._pagination(collection, path, **params): [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] res = self.get(path, params=params) [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] ret = obj(*args, **kwargs) [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self.retry_request("GET", action, body=body, [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] ret = obj(*args, **kwargs) [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2241.876154] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] return self.do_request(method, action, body=body, [ 2241.877263] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.877263] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] ret = obj(*args, **kwargs) [ 2241.877263] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2241.877263] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] self._handle_fault_response(status_code, replybody, resp) [ 2241.877263] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2241.877263] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2241.877263] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2241.877263] env[69648]: ERROR nova.compute.manager [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] [ 2241.903118] env[69648]: DEBUG oslo_concurrency.lockutils [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Lock "0a8de1d1-a783-4a32-9ee0-abb023943eeb" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.303s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2241.967255] env[69648]: INFO nova.compute.manager [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] [instance: 0a8de1d1-a783-4a32-9ee0-abb023943eeb] Successfully reverted task state from None on failure for instance. [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server [None req-41ce080b-f1d2-42e2-9982-73ecaa3f9d86 tempest-ServerShowV257Test-283671334 tempest-ServerShowV257Test-283671334-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-43c9e28f-86e6-458b-83e4-2f830493fa10'] [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 2241.970727] env[69648]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2241.971259] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance [ 2241.971813] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server raise self.value [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 2241.972317] env[69648]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server return evt.wait() [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 2241.972805] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.972805] env[69648]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2241.973474] env[69648]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2241.974192] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2241.974192] env[69648]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 2241.974192] env[69648]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2241.974192] env[69648]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2241.974192] env[69648]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2241.974192] env[69648]: ERROR oslo_messaging.rpc.server [ 2283.065600] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2285.061617] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2285.064272] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2286.065931] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2289.040363] env[69648]: WARNING oslo_vmware.rw_handles [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2289.040363] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2289.040363] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2289.040363] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2289.040363] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2289.040363] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 2289.040363] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2289.040363] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2289.040363] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2289.040363] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2289.040363] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2289.040363] env[69648]: ERROR oslo_vmware.rw_handles [ 2289.041064] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/f8f5a70c-1344-4b5f-85d6-a5c209d0deb8/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2289.043052] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Caching 
image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2289.043257] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Copying Virtual Disk [datastore1] vmware_temp/f8f5a70c-1344-4b5f-85d6-a5c209d0deb8/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/f8f5a70c-1344-4b5f-85d6-a5c209d0deb8/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2289.043599] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-734e65e2-5c17-4aba-ac7b-95256c353fa2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.051399] env[69648]: DEBUG oslo_vmware.api [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for the task: (returnval){ [ 2289.051399] env[69648]: value = "task-3466703" [ 2289.051399] env[69648]: _type = "Task" [ 2289.051399] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2289.059596] env[69648]: DEBUG oslo_vmware.api [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Task: {'id': task-3466703, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2289.065156] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2289.567038] env[69648]: DEBUG oslo_vmware.exceptions [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Fault InvalidArgument not matched. 
{{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2289.567432] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2289.568316] env[69648]: ERROR nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2289.568316] env[69648]: Faults: ['InvalidArgument'] [ 2289.568316] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Traceback (most recent call last): [ 2289.568316] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2289.568316] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] yield resources [ 2289.568316] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2289.568316] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] self.driver.spawn(context, instance, image_meta, [ 2289.568316] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2289.568316] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2289.568316] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2289.568316] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] self._fetch_image_if_missing(context, vi) [ 2289.568316] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] image_cache(vi, tmp_image_ds_loc) [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] vm_util.copy_virtual_disk( [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] session._wait_for_task(vmdk_copy_task) [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] return self.wait_for_task(task_ref) [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] return evt.wait() [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] result = hub.switch() [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2289.568649] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] return self.greenlet.switch() [ 2289.569068] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2289.569068] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] self.f(*self.args, **self.kw) [ 2289.569068] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2289.569068] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] raise exceptions.translate_fault(task_info.error) [ 2289.569068] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2289.569068] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Faults: ['InvalidArgument'] [ 2289.569068] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] [ 2289.569068] env[69648]: INFO nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Terminating instance [ 2289.572727] env[69648]: DEBUG nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Start destroying the instance on the hypervisor. 
{{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2289.573055] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2289.573507] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2289.573829] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2289.574959] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a64b4a4-be97-4932-ae4f-b0a812af8275 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.579032] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c01b5cf3-c836-452c-925a-2436a102f6c9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.587040] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2289.587350] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5437ed7c-0308-4a61-addf-f9ba332cc063 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.590457] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2289.590765] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2289.592166] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1422ad63-9824-48e4-bbcd-28c25f1ba7e5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.598133] env[69648]: DEBUG oslo_vmware.api [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for the task: (returnval){ [ 2289.598133] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]523ba04c-fe55-adf1-6439-89099a544b2a" [ 2289.598133] env[69648]: _type = "Task" [ 2289.598133] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2289.608618] env[69648]: DEBUG oslo_vmware.api [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]523ba04c-fe55-adf1-6439-89099a544b2a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2289.667512] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2289.667797] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2289.667944] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Deleting the datastore file [datastore1] 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2289.668240] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e0d31676-8a37-4662-a0f2-115183ea9f7f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.674635] env[69648]: DEBUG oslo_vmware.api [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for the task: (returnval){ [ 2289.674635] env[69648]: value = "task-3466705" [ 2289.674635] env[69648]: _type = "Task" [ 2289.674635] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2289.682607] env[69648]: DEBUG oslo_vmware.api [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Task: {'id': task-3466705, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2290.111173] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2290.111479] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Creating directory with path [datastore1] vmware_temp/5e70dabd-b253-4bc7-8f0c-abaa3d5d4906/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2290.111773] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-db29deeb-e00e-46bb-8957-67994ad66fa9 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.124360] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Created directory with path [datastore1] vmware_temp/5e70dabd-b253-4bc7-8f0c-abaa3d5d4906/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2290.124565] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Fetch image to [datastore1] vmware_temp/5e70dabd-b253-4bc7-8f0c-abaa3d5d4906/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2290.124708] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/5e70dabd-b253-4bc7-8f0c-abaa3d5d4906/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2290.125531] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b421ad9b-73b9-4d80-9e85-f72f25289159 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.132052] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06911091-e91b-4b04-a576-2dde7083b88a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.141687] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5a24e9b-4a15-4b9c-9bfb-eb297665552e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.171699] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-260fef77-0c20-433b-a4ac-4b3d06c13955 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.179803] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4a528dc6-5574-41c2-a8a2-1cc4e7d38590 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.184137] env[69648]: DEBUG oslo_vmware.api [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Task: {'id': task-3466705, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.088619} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2290.184649] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2290.184840] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2290.185024] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2290.185206] env[69648]: INFO nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Took 0.61 seconds to destroy the instance on the hypervisor. 
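The records around task-3466703 and task-3466705 show the driver's submit-then-poll pattern: a vCenter task (CopyVirtualDisk_Task, DeleteDatastoreFile_Task) is created, then its task info is polled ("progress is 0%") until it either completes successfully or carries an error, in which case the fault is translated and raised (the "Fault InvalidArgument not matched" step followed by the VimFaultException). A minimal sketch of that polling idea, assuming a hypothetical get_task_info callable rather than the real oslo.vmware API:

    # Sketch of the submit-then-poll pattern visible above; get_task_info() is a
    # hypothetical callable standing in for the vSphere TaskInfo property read,
    # and the states mirror vCenter's queued/running/success/error model.
    import time


    class TaskFaultError(Exception):
        """Stand-in for the translated vCenter fault (e.g. InvalidArgument)."""


    def wait_for_task(get_task_info, poll_interval=0.5, timeout=300):
        """Poll a task until it succeeds, fails, or the timeout expires."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()  # e.g. {'state': 'running', 'progress': 40}
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # The real code first maps the fault name to an exception class,
                # which is the "get_fault_class" step logged above.
                raise TaskFaultError(info.get('error'))
            time.sleep(poll_interval)
        raise TimeoutError('task did not complete in time')

In the log this loop runs inside a green thread, so the periodic tasks interleaved between the poll records continue to execute while the copy and delete tasks are outstanding.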
[ 2290.187384] env[69648]: DEBUG nova.compute.claims [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2290.187570] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2290.187771] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2290.206510] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2290.260298] env[69648]: DEBUG oslo_vmware.rw_handles [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5e70dabd-b253-4bc7-8f0c-abaa3d5d4906/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2290.321335] env[69648]: DEBUG oslo_vmware.rw_handles [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2290.321522] env[69648]: DEBUG oslo_vmware.rw_handles [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5e70dabd-b253-4bc7-8f0c-abaa3d5d4906/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2290.389083] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53483996-3698-4289-94a6-d2a2dc77039b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.396584] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2f952b5-3fd3-4097-a95f-e02914b8e060 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.427282] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5686a405-b820-46c1-90f7-9bd95e5ad7e7 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.433929] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af9c1deb-90a6-47a1-9228-bf12217ebeb2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.446663] env[69648]: DEBUG nova.compute.provider_tree [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2290.454788] env[69648]: DEBUG nova.scheduler.client.report [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2290.469168] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.281s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2290.469682] env[69648]: ERROR nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2290.469682] env[69648]: Faults: ['InvalidArgument'] [ 2290.469682] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Traceback (most recent call last): [ 2290.469682] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2290.469682] env[69648]: ERROR 
nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] self.driver.spawn(context, instance, image_meta, [ 2290.469682] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2290.469682] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2290.469682] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2290.469682] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] self._fetch_image_if_missing(context, vi) [ 2290.469682] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2290.469682] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] image_cache(vi, tmp_image_ds_loc) [ 2290.469682] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] vm_util.copy_virtual_disk( [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] session._wait_for_task(vmdk_copy_task) [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] return self.wait_for_task(task_ref) [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] return evt.wait() [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] result = hub.switch() [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] return self.greenlet.switch() [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2290.470046] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] self.f(*self.args, **self.kw) [ 2290.470400] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2290.470400] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] raise exceptions.translate_fault(task_info.error) [ 2290.470400] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2290.470400] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Faults: ['InvalidArgument'] [ 2290.470400] env[69648]: ERROR nova.compute.manager [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] [ 2290.470400] env[69648]: DEBUG nova.compute.utils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2290.471719] env[69648]: DEBUG nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Build of instance 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 was re-scheduled: A specified parameter was not correct: fileType [ 2290.471719] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2290.472122] env[69648]: DEBUG nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2290.472299] env[69648]: DEBUG nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2290.472470] env[69648]: DEBUG nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2290.472634] env[69648]: DEBUG nova.network.neutron [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2290.892063] env[69648]: DEBUG nova.network.neutron [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2290.904194] env[69648]: INFO nova.compute.manager [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Took 0.43 seconds to deallocate network for instance. [ 2291.008859] env[69648]: INFO nova.scheduler.client.report [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Deleted allocations for instance 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 [ 2291.030034] env[69648]: DEBUG oslo_concurrency.lockutils [None req-06ff4030-1c0a-4a04-b23a-3878af7932ab tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 507.431s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2291.030307] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 311.720s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2291.030531] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Acquiring lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2291.031603] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2291.031603] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2291.032966] env[69648]: INFO nova.compute.manager [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Terminating instance [ 2291.034845] env[69648]: DEBUG nova.compute.manager [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2291.035056] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2291.035537] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f86d07a6-5f8e-4801-99bb-b3c416b854fa {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2291.044646] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-657f6c15-66e9-43ce-bff7-899461f0db19 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2291.071521] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3d53af88-d0ea-4aff-a36b-23eb2c07bd68 could not be found. [ 2291.071744] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2291.071923] env[69648]: INFO nova.compute.manager [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2291.072196] env[69648]: DEBUG oslo.service.loopingcall [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2291.072682] env[69648]: DEBUG nova.compute.manager [-] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2291.072787] env[69648]: DEBUG nova.network.neutron [-] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2291.106783] env[69648]: DEBUG nova.network.neutron [-] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2291.114956] env[69648]: INFO nova.compute.manager [-] [instance: 3d53af88-d0ea-4aff-a36b-23eb2c07bd68] Took 0.04 seconds to deallocate network for instance. [ 2291.211772] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b1cfcfc2-ac65-4123-8fc3-0fec800d3a03 tempest-AttachVolumeTestJSON-1843339312 tempest-AttachVolumeTestJSON-1843339312-project-member] Lock "3d53af88-d0ea-4aff-a36b-23eb2c07bd68" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.181s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2292.065409] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2292.065409] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2295.066602] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2295.077846] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2295.078096] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2295.078276] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2295.078438] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2295.079571] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2c46ec7-5b0b-4d69-bac3-4ead38be096f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2295.088409] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2223911a-b158-4925-b41e-b6c6a45388d8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2295.102545] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c00758c0-99ab-41d7-86bc-e9f0d45cb966 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2295.108839] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ef3bed8-bc27-42c3-ac21-8ea581eb3c21 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2295.137457] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180963MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2295.137604] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
2295.137804] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2295.201319] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2295.202046] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cb6b7f04-1c44-4998-bd28-8a01c4b235e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2295.202046] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cc77a95f-ea00-4b01-96ac-8256672eeb39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2295.202250] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6f2b5030-4606-4873-a80b-186b841cc7dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2295.202250] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 349fb4bd-6187-4914-8322-082865bc5562 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2295.202358] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 70632638-9c26-4c7b-a01e-9fa13edd409a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2295.202470] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2295.202652] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2295.202793] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2295.286590] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6397fbf4-23db-4ec7-b4cc-d528aeadb74a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2295.294378] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9397f9b8-4aef-4973-8179-961c9ba81194 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2295.324715] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95bdbae3-20d0-458c-b883-245d4e085801 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2295.331612] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee157a80-8e26-4095-ad6b-c874d19956ff {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2295.344308] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2295.353030] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2295.365490] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2295.365649] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.228s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2297.364929] env[69648]: DEBUG oslo_service.periodic_task [None 
req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2297.365274] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 2297.365274] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2297.382058] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2297.382234] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2297.382374] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2297.382506] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2297.382629] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2297.382754] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2297.382874] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2297.382997] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2297.383728] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2339.927151] env[69648]: WARNING oslo_vmware.rw_handles [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2339.927151] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2339.927151] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2339.927151] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2339.927151] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2339.927151] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 2339.927151] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2339.927151] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2339.927151] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2339.927151] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2339.927151] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2339.927151] env[69648]: ERROR oslo_vmware.rw_handles [ 2339.927971] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/5e70dabd-b253-4bc7-8f0c-abaa3d5d4906/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2339.929829] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2339.930096] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Copying Virtual Disk [datastore1] vmware_temp/5e70dabd-b253-4bc7-8f0c-abaa3d5d4906/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/5e70dabd-b253-4bc7-8f0c-abaa3d5d4906/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2339.930395] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6bfa2c35-bf1d-47a9-9001-27979464fd23 
{{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2339.938246] env[69648]: DEBUG oslo_vmware.api [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for the task: (returnval){ [ 2339.938246] env[69648]: value = "task-3466706" [ 2339.938246] env[69648]: _type = "Task" [ 2339.938246] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2339.946163] env[69648]: DEBUG oslo_vmware.api [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Task: {'id': task-3466706, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2340.448773] env[69648]: DEBUG oslo_vmware.exceptions [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2340.449065] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2340.449624] env[69648]: ERROR nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2340.449624] env[69648]: Faults: ['InvalidArgument'] [ 2340.449624] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Traceback (most recent call last): [ 2340.449624] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2340.449624] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] yield resources [ 2340.449624] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2340.449624] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] self.driver.spawn(context, instance, image_meta, [ 2340.449624] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2340.449624] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2340.449624] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2340.449624] env[69648]: ERROR nova.compute.manager [instance: 
f13b5f54-2f87-4c7a-9751-4dc5b7762b83] self._fetch_image_if_missing(context, vi) [ 2340.449624] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] image_cache(vi, tmp_image_ds_loc) [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] vm_util.copy_virtual_disk( [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] session._wait_for_task(vmdk_copy_task) [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] return self.wait_for_task(task_ref) [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] return evt.wait() [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] result = hub.switch() [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2340.450020] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] return self.greenlet.switch() [ 2340.450419] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2340.450419] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] self.f(*self.args, **self.kw) [ 2340.450419] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2340.450419] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] raise exceptions.translate_fault(task_info.error) [ 2340.450419] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2340.450419] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Faults: ['InvalidArgument'] [ 2340.450419] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] [ 2340.450419] env[69648]: INFO nova.compute.manager 
[None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Terminating instance [ 2340.451438] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2340.451653] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2340.451883] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-adf90bcc-452f-4f4c-a4ae-0c2bda2c9520 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.453975] env[69648]: DEBUG nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2340.454213] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2340.454914] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-958dba1f-1cff-4986-bedd-6f042664b48d {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.462289] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2340.462494] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5edf87b2-3458-4587-a965-e93e5194ce83 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.464587] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2340.464769] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2340.465700] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4ea936f3-3ba7-4ea3-9a90-22e85cbdee0b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.470097] env[69648]: DEBUG oslo_vmware.api [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for the task: (returnval){ [ 2340.470097] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528691b8-b3a0-8b26-aae4-a91ddb583172" [ 2340.470097] env[69648]: _type = "Task" [ 2340.470097] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2340.476747] env[69648]: DEBUG oslo_vmware.api [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]528691b8-b3a0-8b26-aae4-a91ddb583172, 'name': SearchDatastore_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2340.528712] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2340.528885] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2340.529083] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Deleting the datastore file [datastore1] f13b5f54-2f87-4c7a-9751-4dc5b7762b83 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2340.529354] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7e166c7b-d915-4a07-8b39-af2aa75c3469 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.535300] env[69648]: DEBUG oslo_vmware.api [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for the task: (returnval){ [ 2340.535300] env[69648]: value = "task-3466708" [ 2340.535300] env[69648]: _type = "Task" [ 2340.535300] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2340.542736] env[69648]: DEBUG oslo_vmware.api [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Task: {'id': task-3466708, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2340.980500] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2340.980791] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Creating directory with path [datastore1] vmware_temp/cf6b3bfc-f5e3-4945-b38c-df9b089dd44b/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2340.981012] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ff4a8736-1a9c-4f67-aebf-21447ccb4200 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.992664] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Created directory with path [datastore1] vmware_temp/cf6b3bfc-f5e3-4945-b38c-df9b089dd44b/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2340.992856] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Fetch image to [datastore1] vmware_temp/cf6b3bfc-f5e3-4945-b38c-df9b089dd44b/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2340.993035] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/cf6b3bfc-f5e3-4945-b38c-df9b089dd44b/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2340.994104] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b21e45b7-91d1-46f0-a0b0-13a91e3024f2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.999897] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-befb723b-0952-4894-acc7-1e0b5bcebad4 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.008581] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5586d59b-c9d4-48f8-82b0-9008f2949711 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.040709] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82e2a4b3-13c5-432c-a745-a990e9529bda {{(pid=69648) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.047179] env[69648]: DEBUG oslo_vmware.api [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Task: {'id': task-3466708, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.089569} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2341.048558] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2341.048751] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2341.048925] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2341.049114] env[69648]: INFO nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 2341.050836] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0dbeb032-9047-41c9-996c-5968b93b8037 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.052689] env[69648]: DEBUG nova.compute.claims [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2341.052856] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2341.053080] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2341.075271] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2341.125474] env[69648]: DEBUG oslo_vmware.rw_handles [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf6b3bfc-f5e3-4945-b38c-df9b089dd44b/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2341.186036] env[69648]: DEBUG oslo_vmware.rw_handles [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2341.186234] env[69648]: DEBUG oslo_vmware.rw_handles [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf6b3bfc-f5e3-4945-b38c-df9b089dd44b/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2341.224168] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4062e83-4792-4067-856b-875fad441b15 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.232572] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af6bf281-8a8d-4dbb-bb19-4612c47cbdd6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.261396] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-136a4b16-1164-47aa-acfd-5395e9c723a5 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.268043] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0a9cd83-6da0-4e07-8d3a-360a31cbac4f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.280411] env[69648]: DEBUG nova.compute.provider_tree [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2341.288998] env[69648]: DEBUG nova.scheduler.client.report [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2341.301731] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.249s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2341.302275] env[69648]: ERROR nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2341.302275] env[69648]: Faults: ['InvalidArgument'] [ 2341.302275] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Traceback (most recent call last): [ 2341.302275] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2341.302275] 
env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] self.driver.spawn(context, instance, image_meta, [ 2341.302275] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2341.302275] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2341.302275] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2341.302275] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] self._fetch_image_if_missing(context, vi) [ 2341.302275] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2341.302275] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] image_cache(vi, tmp_image_ds_loc) [ 2341.302275] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] vm_util.copy_virtual_disk( [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] session._wait_for_task(vmdk_copy_task) [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] return self.wait_for_task(task_ref) [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] return evt.wait() [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] result = hub.switch() [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] return self.greenlet.switch() [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2341.302662] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] self.f(*self.args, **self.kw) [ 2341.303019] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2341.303019] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] raise exceptions.translate_fault(task_info.error) [ 2341.303019] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2341.303019] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Faults: ['InvalidArgument'] [ 2341.303019] env[69648]: ERROR nova.compute.manager [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] [ 2341.303019] env[69648]: DEBUG nova.compute.utils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2341.304306] env[69648]: DEBUG nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Build of instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 was re-scheduled: A specified parameter was not correct: fileType [ 2341.304306] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2341.304677] env[69648]: DEBUG nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2341.304849] env[69648]: DEBUG nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2341.305025] env[69648]: DEBUG nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2341.305228] env[69648]: DEBUG nova.network.neutron [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2341.574694] env[69648]: DEBUG nova.network.neutron [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2341.587667] env[69648]: INFO nova.compute.manager [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Took 0.28 seconds to deallocate network for instance. [ 2341.678716] env[69648]: INFO nova.scheduler.client.report [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Deleted allocations for instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 [ 2341.703054] env[69648]: DEBUG oslo_concurrency.lockutils [None req-04138df1-4171-49a3-b15d-bb29634b1abd tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 547.292s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2341.703353] env[69648]: DEBUG oslo_concurrency.lockutils [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 350.940s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2341.703560] env[69648]: DEBUG oslo_concurrency.lockutils [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2341.703782] env[69648]: DEBUG oslo_concurrency.lockutils [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2341.704096] env[69648]: DEBUG oslo_concurrency.lockutils [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2341.706230] env[69648]: INFO nova.compute.manager [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Terminating instance [ 2341.708134] env[69648]: DEBUG nova.compute.manager [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2341.708337] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2341.709022] env[69648]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-901fba85-74e8-44b7-abe7-fcfa576a1c1a {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.718751] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1750a908-ce21-48bd-94de-548acfa57bc8 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.746215] env[69648]: WARNING nova.virt.vmwareapi.vmops [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f13b5f54-2f87-4c7a-9751-4dc5b7762b83 could not be found. [ 2341.746430] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2341.746636] env[69648]: INFO nova.compute.manager [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2341.746882] env[69648]: DEBUG oslo.service.loopingcall [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=69648) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2341.747356] env[69648]: DEBUG nova.compute.manager [-] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2341.747462] env[69648]: DEBUG nova.network.neutron [-] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2341.774594] env[69648]: DEBUG nova.network.neutron [-] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Updating instance_info_cache with network_info: [] {{(pid=69648) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2341.782565] env[69648]: INFO nova.compute.manager [-] [instance: f13b5f54-2f87-4c7a-9751-4dc5b7762b83] Took 0.04 seconds to deallocate network for instance. [ 2341.862622] env[69648]: DEBUG oslo_concurrency.lockutils [None req-042499fc-dbcd-4a91-9f5a-cfdc6efb7971 tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Lock "f13b5f54-2f87-4c7a-9751-4dc5b7762b83" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.159s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2345.065992] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2346.060348] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2346.064963] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2348.066021] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2349.060836] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2350.065517] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=69648) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2353.065627] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2353.065936] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=69648) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2354.065617] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2354.065805] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances with incomplete migration {{(pid=69648) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}} [ 2356.073678] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager.update_available_resource {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2356.085857] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2356.086105] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2356.086381] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2356.086448] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=69648) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2356.087594] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f691e2be-e54d-4d44-901e-7b7d3ad2bbeb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2356.096466] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e73c8039-982e-4599-8e0e-a4af26005294 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2356.110082] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-0237453e-2810-4c60-aa62-ad8856afe486 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2356.116353] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-158950f4-f666-4f02-ba32-189fec717afb {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2356.146101] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180941MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=69648) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2356.146245] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2356.146499] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2356.282782] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cb6b7f04-1c44-4998-bd28-8a01c4b235e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2356.282956] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance cc77a95f-ea00-4b01-96ac-8256672eeb39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2356.283112] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 6f2b5030-4606-4873-a80b-186b841cc7dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2356.283244] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 349fb4bd-6187-4914-8322-082865bc5562 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2356.283369] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 70632638-9c26-4c7b-a01e-9fa13edd409a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2356.283491] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Instance 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=69648) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2356.283687] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2356.283825] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=69648) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2356.305438] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing inventories for resource provider d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2356.319272] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Updating ProviderTree inventory for provider d38a352b-7808-44da-8216-792e96aadc88 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2356.319468] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Updating inventory in ProviderTree for provider d38a352b-7808-44da-8216-792e96aadc88 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2356.330914] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing aggregate associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, aggregates: None {{(pid=69648) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2356.348447] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Refreshing trait associations for resource provider d38a352b-7808-44da-8216-792e96aadc88, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=69648) _refresh_associations 
/opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2356.423155] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dad954c9-3cb7-41fb-9f28-4a0b4d4d4606 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2356.430985] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd846daa-0b4a-4eae-aa23-6e04aa368b19 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2356.461349] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5695bced-f6c5-4b95-87ae-bd5c834af2c1 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2356.468739] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9155890-4ee2-425a-8cad-baee58d21d2b {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2356.482573] env[69648]: DEBUG nova.compute.provider_tree [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2356.491274] env[69648]: DEBUG nova.scheduler.client.report [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2356.505120] env[69648]: DEBUG nova.compute.resource_tracker [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=69648) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2356.505334] env[69648]: DEBUG oslo_concurrency.lockutils [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.359s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2357.497586] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2359.066073] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2359.066073] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Starting heal instance info cache {{(pid=69648) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9917}} [ 2359.066073] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Rebuilding the list of instances to heal {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2359.082498] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2359.082686] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2359.082776] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 6f2b5030-4606-4873-a80b-186b841cc7dd] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2359.082901] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 349fb4bd-6187-4914-8322-082865bc5562] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2359.083037] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 70632638-9c26-4c7b-a01e-9fa13edd409a] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2359.083165] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] [instance: 85f7d10c-ddc9-4e9f-9462-7e67447bc8d6] Skipping network cache update for instance because it is Building. {{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2359.083289] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Didn't find any instances for network info cache update. 
{{(pid=69648) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2361.065714] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2361.066064] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Cleaning up deleted instances {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}} [ 2361.075293] env[69648]: DEBUG nova.compute.manager [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] There are 0 instances to clean {{(pid=69648) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}} [ 2374.013522] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2374.013522] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Getting list of instances from cluster (obj){ [ 2374.013522] env[69648]: value = "domain-c8" [ 2374.013522] env[69648]: _type = "ClusterComputeResource" [ 2374.013522] env[69648]: } {{(pid=69648) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2374.013522] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bebcd86-b2ab-4fd6-831d-8abd2a46de51 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2374.027250] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Got total of 6 instances {{(pid=69648) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2377.065850] env[69648]: DEBUG oslo_service.periodic_task [None req-b94a2fd1-144a-4ca4-a231-41ee3847b852 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=69648) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2383.203043] env[69648]: DEBUG oslo_concurrency.lockutils [None req-e060bfdb-013e-4885-9964-90571d2a41fe tempest-ServerDiskConfigTestJSON-1025042355 tempest-ServerDiskConfigTestJSON-1025042355-project-member] Acquiring lock "349fb4bd-6187-4914-8322-082865bc5562" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2385.039139] env[69648]: DEBUG oslo_concurrency.lockutils [None req-82e5a3bf-a851-4765-82f8-e1f4cde10950 tempest-ServerAddressesTestJSON-1926777920 tempest-ServerAddressesTestJSON-1926777920-project-member] Acquiring lock "70632638-9c26-4c7b-a01e-9fa13edd409a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2389.323629] env[69648]: WARNING oslo_vmware.rw_handles [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without 
response [ 2389.323629] env[69648]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2389.323629] env[69648]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2389.323629] env[69648]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2389.323629] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2389.323629] env[69648]: ERROR oslo_vmware.rw_handles response.begin() [ 2389.323629] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2389.323629] env[69648]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2389.323629] env[69648]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2389.323629] env[69648]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2389.323629] env[69648]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2389.323629] env[69648]: ERROR oslo_vmware.rw_handles [ 2389.324407] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Downloaded image file data b010aefa-553b-437c-bd1e-78b0a276a491 to vmware_temp/cf6b3bfc-f5e3-4945-b38c-df9b089dd44b/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2389.326153] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Caching image {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2389.326396] env[69648]: DEBUG nova.virt.vmwareapi.vm_util [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Copying Virtual Disk [datastore1] vmware_temp/cf6b3bfc-f5e3-4945-b38c-df9b089dd44b/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk to [datastore1] vmware_temp/cf6b3bfc-f5e3-4945-b38c-df9b089dd44b/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk {{(pid=69648) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2389.326674] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6b812d59-e2f0-4f8a-a061-7eee6baea5c3 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2389.335526] env[69648]: DEBUG oslo_vmware.api [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for the task: (returnval){ [ 2389.335526] env[69648]: value = "task-3466709" [ 2389.335526] env[69648]: _type = "Task" [ 2389.335526] env[69648]: } to complete. 
{{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2389.343488] env[69648]: DEBUG oslo_vmware.api [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Task: {'id': task-3466709, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2389.846586] env[69648]: DEBUG oslo_vmware.exceptions [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Fault InvalidArgument not matched. {{(pid=69648) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2389.846994] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2389.847644] env[69648]: ERROR nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2389.847644] env[69648]: Faults: ['InvalidArgument'] [ 2389.847644] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Traceback (most recent call last): [ 2389.847644] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2389.847644] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] yield resources [ 2389.847644] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2389.847644] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] self.driver.spawn(context, instance, image_meta, [ 2389.847644] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2389.847644] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2389.847644] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2389.847644] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] self._fetch_image_if_missing(context, vi) [ 2389.847644] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] image_cache(vi, tmp_image_ds_loc) [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] vm_util.copy_virtual_disk( [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] session._wait_for_task(vmdk_copy_task) [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] return self.wait_for_task(task_ref) [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] return evt.wait() [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] result = hub.switch() [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2389.848247] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] return self.greenlet.switch() [ 2389.848793] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2389.848793] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] self.f(*self.args, **self.kw) [ 2389.848793] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2389.848793] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] raise exceptions.translate_fault(task_info.error) [ 2389.848793] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2389.848793] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Faults: ['InvalidArgument'] [ 2389.848793] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] [ 2389.848793] env[69648]: INFO nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Terminating instance [ 2389.850177] env[69648]: DEBUG oslo_concurrency.lockutils [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/b010aefa-553b-437c-bd1e-78b0a276a491/b010aefa-553b-437c-bd1e-78b0a276a491.vmdk" {{(pid=69648) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2389.850520] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2389.850801] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-649606b8-9fe6-410d-9303-699e2ed8665c {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2389.854383] env[69648]: DEBUG nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Start destroying the instance on the hypervisor. {{(pid=69648) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2389.854653] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Destroying instance {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2389.855500] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb93a14a-9a7a-43eb-8bd4-c856dd2f7c48 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2389.859082] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2389.859340] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=69648) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2389.860368] env[69648]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ca1e0595-dece-4113-8215-a02c5724a04e {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2389.864225] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Unregistering the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2389.864768] env[69648]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-deec4ac0-5376-42da-94c5-79cb48109dc0 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2389.867237] env[69648]: DEBUG oslo_vmware.api [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Waiting for the task: (returnval){ [ 2389.867237] env[69648]: value = "session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52e2711c-c388-c1ee-dcd0-4f984e71009f" [ 2389.867237] env[69648]: _type = "Task" [ 2389.867237] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2389.874665] env[69648]: DEBUG oslo_vmware.api [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Task: {'id': session[52c50b56-0fc2-f6d2-d009-ab96e41ac7cd]52e2711c-c388-c1ee-dcd0-4f984e71009f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2389.935349] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Unregistered the VM {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2389.935556] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Deleting contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2389.935776] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Deleting the datastore file [datastore1] cb6b7f04-1c44-4998-bd28-8a01c4b235e8 {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2389.936080] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-894717d6-9696-47a7-989a-6707ac1a1e80 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2389.942174] env[69648]: DEBUG oslo_vmware.api [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Waiting for the task: (returnval){ [ 2389.942174] env[69648]: value = "task-3466711" [ 2389.942174] env[69648]: _type = "Task" [ 2389.942174] env[69648]: } to complete. {{(pid=69648) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2389.950491] env[69648]: DEBUG oslo_vmware.api [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Task: {'id': task-3466711, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2390.378488] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Preparing fetch location {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2390.378894] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Creating directory with path [datastore1] vmware_temp/2f27fa14-0e1a-49a7-81a9-fe01c69f4541/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2390.379110] env[69648]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e29a07c8-b9f1-49fe-be00-39ce72cc8760 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.390840] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Created directory with path [datastore1] vmware_temp/2f27fa14-0e1a-49a7-81a9-fe01c69f4541/b010aefa-553b-437c-bd1e-78b0a276a491 {{(pid=69648) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2390.391045] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Fetch image to [datastore1] vmware_temp/2f27fa14-0e1a-49a7-81a9-fe01c69f4541/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk {{(pid=69648) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2390.391227] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to [datastore1] vmware_temp/2f27fa14-0e1a-49a7-81a9-fe01c69f4541/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk on the data store datastore1 {{(pid=69648) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2390.391932] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e51a82af-a5db-4c7c-b0eb-099915c4f5cd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.398737] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42d1bf71-1f04-4d83-a112-211c8bf45ac6 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.408164] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f73e036e-ac17-4d00-8fb7-288ed5554c3f {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.439091] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddafe24c-7ebf-4e1e-b34b-64d71d40339b {{(pid=69648) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.447000] env[69648]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5907ae59-f26c-428a-ab5c-7b371fe46ddd {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.451169] env[69648]: DEBUG oslo_vmware.api [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Task: {'id': task-3466711, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080175} completed successfully. {{(pid=69648) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2390.451632] env[69648]: DEBUG nova.virt.vmwareapi.ds_util [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Deleted the datastore file {{(pid=69648) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2390.451815] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Deleted contents of the VM from datastore datastore1 {{(pid=69648) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2390.451983] env[69648]: DEBUG nova.virt.vmwareapi.vmops [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Instance destroyed {{(pid=69648) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2390.452158] env[69648]: INFO nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Took 0.60 seconds to destroy the instance on the hypervisor. 
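The UnregisterVM / DeleteDatastoreFile_Task sequence above is driven by the same poll-until-terminal loop that appears throughout this log: "Waiting for the task ... to complete", repeated "progress is 0%" entries, then "completed successfully" with a duration. The sketch below is only an illustration of that polling pattern under stated assumptions; fetch_task_info() is a hypothetical stand-in for a vCenter TaskInfo lookup, and this is not the oslo_vmware wait_for_task/_poll_task implementation itself.

# Minimal sketch of the poll-until-complete pattern seen above
# (task-3466711: "progress is 0%" ... "completed successfully").
# `fetch_task_info` is a hypothetical callable, not the oslo_vmware API.
import time


class TaskFailed(Exception):
    """Raised when the polled task reports an error state."""


def wait_for_task(fetch_task_info, poll_interval=0.5, timeout=60.0):
    """Poll a task until it reaches a terminal state.

    fetch_task_info() is assumed to return a dict with at least:
      'state'    -- one of 'queued', 'running', 'success', 'error'
      'progress' -- integer percentage (optional)
      'error'    -- error message when state == 'error'
    """
    deadline = time.monotonic() + timeout
    while True:
        info = fetch_task_info()
        state = info.get('state')
        if state == 'success':
            return info
        if state == 'error':
            # Comparable to _poll_task raising translate_fault(task_info.error)
            # in the tracebacks earlier in this log.
            raise TaskFailed(info.get('error', 'unknown fault'))
        if time.monotonic() > deadline:
            raise TaskFailed('timed out waiting for task')
        # Comparable to the repeated "progress is N%" DEBUG lines.
        print('progress is %s%%' % info.get('progress', 0))
        time.sleep(poll_interval)


if __name__ == '__main__':
    # Fake task that finishes on the third poll.
    states = iter([{'state': 'running', 'progress': 0},
                   {'state': 'running', 'progress': 50},
                   {'state': 'success', 'progress': 100}])
    print(wait_for_task(lambda: next(states), poll_interval=0.01))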
[ 2390.454259] env[69648]: DEBUG nova.compute.claims [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Aborting claim: {{(pid=69648) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2390.454437] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2390.454647] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2390.469901] env[69648]: DEBUG nova.virt.vmwareapi.images [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] [instance: cc77a95f-ea00-4b01-96ac-8256672eeb39] Downloading image file data b010aefa-553b-437c-bd1e-78b0a276a491 to the data store datastore1 {{(pid=69648) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2390.611404] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51e80321-2fd2-417c-8f5b-66d3ec208294 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.615120] env[69648]: DEBUG oslo_vmware.rw_handles [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2f27fa14-0e1a-49a7-81a9-fe01c69f4541/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=69648) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2390.672235] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78aa7085-93d5-49df-8bbb-942fd4aa30a2 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.677454] env[69648]: DEBUG oslo_vmware.rw_handles [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Completed reading data from the image iterator. {{(pid=69648) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2390.677660] env[69648]: DEBUG oslo_vmware.rw_handles [None req-020890b6-304a-4534-a201-e466f3efbffd tempest-ServersTestJSON-1251889119 tempest-ServersTestJSON-1251889119-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2f27fa14-0e1a-49a7-81a9-fe01c69f4541/b010aefa-553b-437c-bd1e-78b0a276a491/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=69648) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2390.703020] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-753e3071-80b9-48ff-97a5-0bcdce481260 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.710012] env[69648]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f26a15f1-79e4-4308-95ff-5578d14a4393 {{(pid=69648) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.722911] env[69648]: DEBUG nova.compute.provider_tree [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Inventory has not changed in ProviderTree for provider: d38a352b-7808-44da-8216-792e96aadc88 {{(pid=69648) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2390.733562] env[69648]: DEBUG nova.scheduler.client.report [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Inventory has not changed for provider d38a352b-7808-44da-8216-792e96aadc88 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=69648) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2390.750286] env[69648]: DEBUG oslo_concurrency.lockutils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.296s {{(pid=69648) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2390.750799] env[69648]: ERROR nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2390.750799] env[69648]: Faults: ['InvalidArgument'] [ 2390.750799] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Traceback (most recent call last): [ 2390.750799] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2390.750799] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] self.driver.spawn(context, instance, image_meta, [ 2390.750799] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2390.750799] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2390.750799] env[69648]: ERROR nova.compute.manager [instance: 
cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2390.750799] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] self._fetch_image_if_missing(context, vi) [ 2390.750799] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2390.750799] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] image_cache(vi, tmp_image_ds_loc) [ 2390.750799] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] vm_util.copy_virtual_disk( [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] session._wait_for_task(vmdk_copy_task) [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] return self.wait_for_task(task_ref) [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] return evt.wait() [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] result = hub.switch() [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] return self.greenlet.switch() [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2390.751136] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] self.f(*self.args, **self.kw) [ 2390.751428] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2390.751428] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] raise exceptions.translate_fault(task_info.error) [ 2390.751428] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2390.751428] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Faults: 
['InvalidArgument'] [ 2390.751428] env[69648]: ERROR nova.compute.manager [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] [ 2390.751546] env[69648]: DEBUG nova.compute.utils [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] VimFaultException {{(pid=69648) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2390.752893] env[69648]: DEBUG nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Build of instance cb6b7f04-1c44-4998-bd28-8a01c4b235e8 was re-scheduled: A specified parameter was not correct: fileType [ 2390.752893] env[69648]: Faults: ['InvalidArgument'] {{(pid=69648) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2390.753285] env[69648]: DEBUG nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Unplugging VIFs for instance {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2390.753465] env[69648]: DEBUG nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=69648) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2390.753639] env[69648]: DEBUG nova.compute.manager [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] Deallocating network for instance {{(pid=69648) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2390.753795] env[69648]: DEBUG nova.network.neutron [None req-42141231-1516-4790-a23d-5a0e6a9718b3 tempest-DeleteServersTestJSON-579077004 tempest-DeleteServersTestJSON-579077004-project-member] [instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] deallocate_for_instance() {{(pid=69648) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
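The section ends with the build of instance cb6b7f04-1c44-4998-bd28-8a01c4b235e8 being re-scheduled after the CopyVirtualDisk_Task fault ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']) that followed the dropped image-download connection. When triaging a flattened log like this one, it can help to regroup WARNING/ERROR entries per instance UUID. The sketch below does that with stdlib regexes; the entry pattern is an assumption inferred from the lines above, not an official log format or Nova tooling.

# Rough sketch: group WARNING/ERROR entries of a flattened nova-compute
# log by instance UUID. Regexes are assumptions based on the entry
# format visible in this log ("[ <seconds>] env[<pid>]: LEVEL logger ...").
import re
from collections import defaultdict

ENTRY = re.compile(
    r'\[\s*(?P<ts>\d+\.\d+)\] env\[\d+\]: '
    r'(?P<level>DEBUG|INFO|WARNING|ERROR) (?P<logger>\S+)')
INSTANCE = re.compile(r'\[instance: (?P<uuid>[0-9a-f][0-9a-f-]{35})\]')


def failures_by_instance(text):
    """Return {instance_uuid: [(timestamp, level, snippet), ...]}."""
    results = defaultdict(list)
    # Entries in this log are flattened, so split on the next
    # "[ <seconds>] env[<pid>]:" marker rather than on newlines.
    chunks = re.split(r'(?=\[\s*\d+\.\d+\] env\[\d+\]: )', text)
    for chunk in chunks:
        head = ENTRY.match(chunk)
        if not head or head.group('level') not in ('WARNING', 'ERROR'):
            continue
        inst = INSTANCE.search(chunk)
        uuid = inst.group('uuid') if inst else '<no instance>'
        snippet = ' '.join(chunk.split())[:160]
        results[uuid].append((float(head.group('ts')),
                              head.group('level'), snippet))
    return results


if __name__ == '__main__':
    sample = ("[ 2390.750799] env[69648]: ERROR nova.compute.manager "
              "[instance: cb6b7f04-1c44-4998-bd28-8a01c4b235e8] "
              "Faults: ['InvalidArgument']")
    for uuid, entries in failures_by_instance(sample).items():
        print(uuid, entries)

Run against the full text of this section, such a pass would surface the oslo_vmware.rw_handles RemoteDisconnected warning and the two VimFaultException tracebacks under the same instance UUID, which is the correlation a reader otherwise has to do by eye.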